Oct 12 20:24:10 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 12 20:24:10 crc restorecon[4632]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 20:24:10 crc restorecon[4632]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc 
restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc 
restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 
20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc 
restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc 
restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:10 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11
crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 
20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 
crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 20:24:11 crc restorecon[4632]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc 
restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 20:24:11 crc restorecon[4632]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 12 20:24:12 crc kubenswrapper[4773]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.196534 4773 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203161 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203189 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203197 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203207 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203214 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203223 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203271 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203283 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203291 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203298 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203304 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203310 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203315 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203321 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203327 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203333 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203340 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203347 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203353 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203360 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203367 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203373 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203380 4773 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203388 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203396 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203403 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203409 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203416 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203423 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203430 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203437 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203443 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203450 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203457 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203464 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203470 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203477 4773 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203488 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203500 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203507 4773 feature_gate.go:330] unrecognized feature gate: Example Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203515 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203523 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203530 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203537 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203544 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203551 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203558 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203565 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203571 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203578 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203585 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 20:24:12 crc 
kubenswrapper[4773]: W1012 20:24:12.203596 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203605 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203612 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203618 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203625 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203633 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203639 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203646 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203652 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203659 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203665 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203672 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203678 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203684 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 20:24:12 crc 
kubenswrapper[4773]: W1012 20:24:12.203691 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203698 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203773 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203781 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203787 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.203797 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204695 4773 flags.go:64] FLAG: --address="0.0.0.0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204736 4773 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204752 4773 flags.go:64] FLAG: --anonymous-auth="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204762 4773 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204772 4773 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204780 4773 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204791 4773 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204801 4773 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204810 4773 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204818 4773 
flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204826 4773 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204837 4773 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204845 4773 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204853 4773 flags.go:64] FLAG: --cgroup-root="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204861 4773 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204869 4773 flags.go:64] FLAG: --client-ca-file="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204877 4773 flags.go:64] FLAG: --cloud-config="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204885 4773 flags.go:64] FLAG: --cloud-provider="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204892 4773 flags.go:64] FLAG: --cluster-dns="[]" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204903 4773 flags.go:64] FLAG: --cluster-domain="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204910 4773 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204919 4773 flags.go:64] FLAG: --config-dir="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204927 4773 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204936 4773 flags.go:64] FLAG: --container-log-max-files="5" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204946 4773 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204953 4773 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 
20:24:12.204962 4773 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204970 4773 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204978 4773 flags.go:64] FLAG: --contention-profiling="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204986 4773 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.204994 4773 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205004 4773 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205013 4773 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205024 4773 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205032 4773 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205039 4773 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205047 4773 flags.go:64] FLAG: --enable-load-reader="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205055 4773 flags.go:64] FLAG: --enable-server="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205062 4773 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205073 4773 flags.go:64] FLAG: --event-burst="100" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205081 4773 flags.go:64] FLAG: --event-qps="50" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205089 4773 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205097 4773 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 12 20:24:12 crc 
kubenswrapper[4773]: I1012 20:24:12.205105 4773 flags.go:64] FLAG: --eviction-hard="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205115 4773 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205123 4773 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205131 4773 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205140 4773 flags.go:64] FLAG: --eviction-soft="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205148 4773 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205155 4773 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205163 4773 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205171 4773 flags.go:64] FLAG: --experimental-mounter-path="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205179 4773 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205187 4773 flags.go:64] FLAG: --fail-swap-on="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205194 4773 flags.go:64] FLAG: --feature-gates="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205205 4773 flags.go:64] FLAG: --file-check-frequency="20s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205213 4773 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205221 4773 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205231 4773 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205240 4773 flags.go:64] FLAG: --healthz-port="10248" Oct 12 20:24:12 
crc kubenswrapper[4773]: I1012 20:24:12.205249 4773 flags.go:64] FLAG: --help="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205258 4773 flags.go:64] FLAG: --hostname-override="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205266 4773 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205275 4773 flags.go:64] FLAG: --http-check-frequency="20s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205284 4773 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205291 4773 flags.go:64] FLAG: --image-credential-provider-config="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205299 4773 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205307 4773 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205316 4773 flags.go:64] FLAG: --image-service-endpoint="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205324 4773 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205332 4773 flags.go:64] FLAG: --kube-api-burst="100" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205340 4773 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205349 4773 flags.go:64] FLAG: --kube-api-qps="50" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205357 4773 flags.go:64] FLAG: --kube-reserved="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205365 4773 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205373 4773 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205381 4773 flags.go:64] FLAG: --kubelet-cgroups="" Oct 12 20:24:12 crc 
kubenswrapper[4773]: I1012 20:24:12.205388 4773 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205397 4773 flags.go:64] FLAG: --lock-file="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205405 4773 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205414 4773 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205422 4773 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205435 4773 flags.go:64] FLAG: --log-json-split-stream="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205443 4773 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205451 4773 flags.go:64] FLAG: --log-text-split-stream="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205459 4773 flags.go:64] FLAG: --logging-format="text" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205467 4773 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205475 4773 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205483 4773 flags.go:64] FLAG: --manifest-url="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205492 4773 flags.go:64] FLAG: --manifest-url-header="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205503 4773 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205511 4773 flags.go:64] FLAG: --max-open-files="1000000" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205521 4773 flags.go:64] FLAG: --max-pods="110" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205529 4773 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 12 20:24:12 crc 
kubenswrapper[4773]: I1012 20:24:12.205537 4773 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205547 4773 flags.go:64] FLAG: --memory-manager-policy="None" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205555 4773 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205563 4773 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205571 4773 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205579 4773 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205598 4773 flags.go:64] FLAG: --node-status-max-images="50" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205606 4773 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205614 4773 flags.go:64] FLAG: --oom-score-adj="-999" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205623 4773 flags.go:64] FLAG: --pod-cidr="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205631 4773 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205643 4773 flags.go:64] FLAG: --pod-manifest-path="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205651 4773 flags.go:64] FLAG: --pod-max-pids="-1" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205659 4773 flags.go:64] FLAG: --pods-per-core="0" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205666 4773 flags.go:64] FLAG: --port="10250" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205675 4773 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205683 4773 flags.go:64] FLAG: --provider-id="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205691 4773 flags.go:64] FLAG: --qos-reserved="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205768 4773 flags.go:64] FLAG: --read-only-port="10255" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205779 4773 flags.go:64] FLAG: --register-node="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205786 4773 flags.go:64] FLAG: --register-schedulable="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205795 4773 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205817 4773 flags.go:64] FLAG: --registry-burst="10" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205825 4773 flags.go:64] FLAG: --registry-qps="5" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205832 4773 flags.go:64] FLAG: --reserved-cpus="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205840 4773 flags.go:64] FLAG: --reserved-memory="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205894 4773 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205903 4773 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205911 4773 flags.go:64] FLAG: --rotate-certificates="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205919 4773 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205926 4773 flags.go:64] FLAG: --runonce="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205934 4773 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205943 4773 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205950 4773 flags.go:64] FLAG: --seccomp-default="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205959 4773 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205966 4773 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205974 4773 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205982 4773 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205990 4773 flags.go:64] FLAG: --storage-driver-password="root" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.205998 4773 flags.go:64] FLAG: --storage-driver-secure="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206006 4773 flags.go:64] FLAG: --storage-driver-table="stats" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206014 4773 flags.go:64] FLAG: --storage-driver-user="root" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206022 4773 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206031 4773 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206039 4773 flags.go:64] FLAG: --system-cgroups="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206046 4773 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206060 4773 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206069 4773 flags.go:64] FLAG: --tls-cert-file="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206077 4773 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 
20:24:12.206086 4773 flags.go:64] FLAG: --tls-min-version="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206094 4773 flags.go:64] FLAG: --tls-private-key-file="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206102 4773 flags.go:64] FLAG: --topology-manager-policy="none" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206109 4773 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206117 4773 flags.go:64] FLAG: --topology-manager-scope="container" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206125 4773 flags.go:64] FLAG: --v="2" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206136 4773 flags.go:64] FLAG: --version="false" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206147 4773 flags.go:64] FLAG: --vmodule="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206156 4773 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206165 4773 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206415 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206426 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206435 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206442 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206451 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206459 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 20:24:12 crc kubenswrapper[4773]: 
W1012 20:24:12.206469 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206478 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206485 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206493 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206501 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206509 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206517 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206524 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206532 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206539 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206546 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206553 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206560 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206570 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206578 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206586 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206593 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206602 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206609 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206616 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206623 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206630 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206637 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206644 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206651 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206658 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206665 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206672 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206679 4773 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206686 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206693 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206699 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206706 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206737 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206745 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206752 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206759 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206766 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206772 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206780 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206787 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206793 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206800 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 20:24:12 crc 
kubenswrapper[4773]: W1012 20:24:12.206806 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206813 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206820 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206830 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206839 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206847 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206854 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206864 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206873 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206880 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206889 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206897 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206905 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206913 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206920 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206927 4773 feature_gate.go:330] unrecognized feature gate: Example
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206934 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206941 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206949 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206956 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206963 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.206971 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.206991 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.216386 4773 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.216439 4773 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216531 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216544 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216552 4773 feature_gate.go:330] unrecognized feature gate: Example
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216556 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216561 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216566 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216573 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216583 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216589 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216595 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216600 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216605 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216610 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216615 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216619 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216624 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216628 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216633 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216638 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216642 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216647 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216652 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216656 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216662 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216668 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216672 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216676 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216681 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216686 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216691 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216696 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216702 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216707 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216732 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216740 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216745 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216750 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216754 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216758 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216764 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216769 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216775 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216781 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216786 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216790 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216794 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216798 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216803 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216807 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216811 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216815 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216820 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216824 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216830 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216836 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216841 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216846 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216851 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216856 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216861 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216865 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216871 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216876 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216881 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216886 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216891 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216895 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216900 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216904 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216908 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.216914 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.216923 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217097 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217109 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217116 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217122 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217128 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217132 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217137 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217142 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217147 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217151 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217155 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217160 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217164 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217168 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217173 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217177 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217181 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217186 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217190 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217194 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217198 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217202 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217207 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217211 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217219 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217224 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217229 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217234 4773 feature_gate.go:330] unrecognized feature gate: Example
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217239 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217245 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217250 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217256 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217260 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217264 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217270 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217274 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217278 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217283 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217287 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217291 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217295 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217298 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217302 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217306 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217310 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217314 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217317 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217321 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217325 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217328 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217332 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217336 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217340 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217344 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217348 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217354 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217359 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217364 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217369 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217374 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217378 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217382 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217386 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217391 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217394 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217399 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217404 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217409 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217414 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217419 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.217424 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.217432 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.218469 4773 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.222468 4773 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.223168 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.224727 4773 server.go:997] "Starting client certificate rotation"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.224755 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.225066 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 17:43:40.065112503 +0000 UTC
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.225190 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2157h19m27.839930266s for next certificate rotation
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.256457 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.260704 4773 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.278546 4773 log.go:25] "Validated CRI v1 runtime API"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.323849 4773 log.go:25] "Validated CRI v1 image API"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.326495 4773 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.334866 4773 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-12-09-02-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.334912 4773 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.356786 4773 manager.go:217] Machine: {Timestamp:2025-10-12 20:24:12.353696405 +0000 UTC m=+0.589995055 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75 BootID:ee41ac78-6c3d-4e51-9248-43b3278b77da Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:87:cc:68 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:87:cc:68 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ff:13:47 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5a:18:73 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2c:3b:0c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d6:36:fe Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:28:7b:58 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:fe:f9:16:ef:fa:49 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:52:e2:24:87:51:4d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.357140 4773 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.357359 4773 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.359137 4773 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.359503 4773 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.359562 4773 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.359962 4773 topology_manager.go:138] "Creating topology manager with none policy"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.359983 4773 container_manager_linux.go:303] "Creating device plugin manager"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.360516 4773 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.360585 4773 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.360986 4773 state_mem.go:36] "Initialized new in-memory state store"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.361166 4773 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.369126 4773 kubelet.go:418] "Attempting to sync node with API server"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.369177 4773 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.369254 4773 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.369278 4773 kubelet.go:324] "Adding apiserver pod source"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.369297 4773 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.382103 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused
Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.382263 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError"
Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.382304 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused
Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.383535 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.383591 4773 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.384740 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.386419 4773 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388007 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388051 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388067 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388082 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388107 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388122 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388136 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388160 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388177 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388193 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388211 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.388226 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.390256 4773 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.391043 4773 server.go:1280] "Started kubelet" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.391976 4773 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.392272 4773 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.392837 4773 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.392843 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:12 crc systemd[1]: Started Kubernetes Kubelet. Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397053 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397252 4773 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397450 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:33:52.384561543 +0000 UTC Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397599 4773 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397613 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1463h9m39.986956451s for next certificate rotation Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397586 4773 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 12 
20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.397176 4773 server.go:460] "Adding debug handlers to kubelet server" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.398227 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.398294 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.398388 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.398521 4773 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.401636 4773 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.401671 4773 factory.go:55] Registering systemd factory Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.401691 4773 factory.go:221] Registration of the systemd container factory successfully Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.403423 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.202:6443: connect: connection 
refused" interval="200ms" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.406046 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.406108 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.407218 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.202:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186dd81deba74fdf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-12 20:24:12.390993887 +0000 UTC m=+0.627292487,LastTimestamp:2025-10-12 20:24:12.390993887 +0000 UTC m=+0.627292487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.410786 4773 factory.go:153] Registering CRI-O factory Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.410822 4773 factory.go:221] Registration of the crio container factory successfully Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.410870 4773 factory.go:103] Registering Raw factory Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.410895 4773 manager.go:1196] 
Started watching for new ooms in manager Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411092 4773 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411164 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411192 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411213 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411234 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411254 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411275 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411300 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411320 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411340 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411363 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411390 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411454 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.412333 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.412375 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.412404 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.412435 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.412461 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413469 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413505 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413551 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413576 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413610 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413637 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" 
seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.413662 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414166 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414238 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414268 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414307 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414336 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: 
I1012 20:24:12.414362 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414394 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414417 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414448 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414473 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414495 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414525 4773 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414547 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414575 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414598 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414622 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414697 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.414779 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.415684 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.415951 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.416154 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.416328 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.416489 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.416649 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.417652 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.417893 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418055 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418194 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418334 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418456 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418589 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418772 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.418921 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419039 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419171 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419298 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419425 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419556 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419671 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419835 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.419962 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420088 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420219 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420335 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420454 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420581 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.420694 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.411857 4773 manager.go:319] Starting recovery of all containers
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421038 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421128 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421172 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421196 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421218 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421248 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421272 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421306 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421331 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421352 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421382 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421406 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421435 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421457 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421479 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421529 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421553 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421581 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421601 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421623 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421651 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421673 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421699 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421746 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421792 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421820 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421841 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421878 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421932 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421958 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.421986 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422033 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422060 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422092 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422123 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422149 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422179 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422207 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422230 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422258 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422282 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422311 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422334 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422360 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422380 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422400 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422425 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422447 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422473 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422493 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422513 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422659 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422682 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422709 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422761 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422782 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422806 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422827 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422852 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422873 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422894 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422919 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422937 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422963 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.422981 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423000 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423022 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423041 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423066 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423088 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423109 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423136 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423157 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423183 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423771 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423816 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423862 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423892 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423928 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423958 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.423986 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424024 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424054 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424091 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424120 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424148 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424181 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424209 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424244 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424273 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424303 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424343 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424369 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424404 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424431 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424464 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424499 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424527 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424560 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424587 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424613 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424647 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424674 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424701 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424861 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424893 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424933 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.424968 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425043 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425074 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425113 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425143 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425172 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425209 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425236 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425273 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425302 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486"
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425329 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425364 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425391 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425428 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425457 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425487 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425525 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425553 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425594 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425621 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425652 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425774 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425890 4773 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425922 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425955 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.425986 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.426013 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.426040 4773 reconstruct.go:97] "Volume reconstruction finished" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.426060 4773 reconciler.go:26] "Reconciler: start to sync state" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.449470 4773 manager.go:324] Recovery completed Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.461437 4773 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.463176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.463247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.463268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.465214 4773 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.465242 4773 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.465277 4773 state_mem.go:36] "Initialized new in-memory state store" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.476135 4773 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.479742 4773 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.479790 4773 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.479835 4773 kubelet.go:2335] "Starting kubelet main sync loop" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.479988 4773 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.481004 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.481096 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.482637 4773 policy_none.go:49] "None policy: Start" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.487116 4773 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.487163 4773 state_mem.go:35] "Initializing new in-memory state store" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.498887 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.558334 4773 manager.go:334] "Starting Device Plugin manager" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.560750 4773 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.560793 4773 server.go:79] "Starting device plugin registration server" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.561310 4773 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.561337 4773 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.561784 4773 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.561919 4773 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.561940 4773 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.574528 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.580854 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.580937 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.582565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.582604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.582618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.582851 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.583342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.583423 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584624 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584806 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584845 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.584961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.585022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.585094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.585741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.585782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.585800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.587276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.587856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.587925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.588096 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.588266 
4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.588345 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589370 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589506 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589787 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.589924 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590581 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.590621 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.591618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.591664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.591677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.592361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.592395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.592436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.604080 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.202:6443: connect: connection refused" interval="400ms" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630015 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc 
kubenswrapper[4773]: I1012 20:24:12.630064 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630113 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630158 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630177 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630195 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630253 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630294 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.630346 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.661611 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.662598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.662624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: 
I1012 20:24:12.662632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.662654 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.663053 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.202:6443: connect: connection refused" node="crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731536 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 
20:24:12.731570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731660 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731672 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731683 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731805 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731689 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731864 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731973 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.731991 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.732031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.732088 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.732134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.863752 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.864942 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.864976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.864989 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.865012 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:12 crc kubenswrapper[4773]: E1012 20:24:12.865341 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.202:6443: connect: connection refused" node="crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.916077 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.941073 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.949246 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.960550 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2ffb8e98d13c88fa42697b5460d837e820bf83e6ae22c43552b96196a2b26c54 WatchSource:0}: Error finding container 2ffb8e98d13c88fa42697b5460d837e820bf83e6ae22c43552b96196a2b26c54: Status 404 returned error can't find the container with id 2ffb8e98d13c88fa42697b5460d837e820bf83e6ae22c43552b96196a2b26c54 Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.973471 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: I1012 20:24:12.981237 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.981460 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b8bcaee6c4408f77a205f4d236a34f7229584de26c4fd050904a2c6363e4db31 WatchSource:0}: Error finding container b8bcaee6c4408f77a205f4d236a34f7229584de26c4fd050904a2c6363e4db31: Status 404 returned error can't find the container with id b8bcaee6c4408f77a205f4d236a34f7229584de26c4fd050904a2c6363e4db31 Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.984865 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d71cec8c8fea15fda80a3e3f7adb578d9901c3bab3167cf3a1b927ddb280631c WatchSource:0}: Error finding container d71cec8c8fea15fda80a3e3f7adb578d9901c3bab3167cf3a1b927ddb280631c: Status 404 returned error can't find the container with id d71cec8c8fea15fda80a3e3f7adb578d9901c3bab3167cf3a1b927ddb280631c Oct 12 20:24:12 crc kubenswrapper[4773]: W1012 20:24:12.996076 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-98144faa7d3935cf16ac932f04aeca601fcbd74aa41a728f206237863fb6d557 WatchSource:0}: Error finding container 98144faa7d3935cf16ac932f04aeca601fcbd74aa41a728f206237863fb6d557: Status 404 returned error can't find the container with id 98144faa7d3935cf16ac932f04aeca601fcbd74aa41a728f206237863fb6d557 Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.005353 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.202:6443: connect: connection 
refused" interval="800ms" Oct 12 20:24:13 crc kubenswrapper[4773]: W1012 20:24:13.007988 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f4b8c730c152880546f3ce53b60467dccf97265f1e77250dfc4e1439e24197aa WatchSource:0}: Error finding container f4b8c730c152880546f3ce53b60467dccf97265f1e77250dfc4e1439e24197aa: Status 404 returned error can't find the container with id f4b8c730c152880546f3ce53b60467dccf97265f1e77250dfc4e1439e24197aa Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.266143 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.267431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.267479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.267492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.267527 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.267921 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.202:6443: connect: connection refused" node="crc" Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.393781 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 
20:24:13.485063 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f4b8c730c152880546f3ce53b60467dccf97265f1e77250dfc4e1439e24197aa"} Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.486057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98144faa7d3935cf16ac932f04aeca601fcbd74aa41a728f206237863fb6d557"} Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.487864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d71cec8c8fea15fda80a3e3f7adb578d9901c3bab3167cf3a1b927ddb280631c"} Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.489184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b8bcaee6c4408f77a205f4d236a34f7229584de26c4fd050904a2c6363e4db31"} Oct 12 20:24:13 crc kubenswrapper[4773]: I1012 20:24:13.491247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2ffb8e98d13c88fa42697b5460d837e820bf83e6ae22c43552b96196a2b26c54"} Oct 12 20:24:13 crc kubenswrapper[4773]: W1012 20:24:13.519179 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.519256 4773 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:13 crc kubenswrapper[4773]: W1012 20:24:13.737581 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.738015 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.806848 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.202:6443: connect: connection refused" interval="1.6s" Oct 12 20:24:13 crc kubenswrapper[4773]: W1012 20:24:13.859641 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.859699 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial 
tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:13 crc kubenswrapper[4773]: W1012 20:24:13.952872 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:13 crc kubenswrapper[4773]: E1012 20:24:13.953013 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.068334 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.070639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.070682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.070697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.070747 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:14 crc kubenswrapper[4773]: E1012 20:24:14.071145 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.202:6443: connect: connection refused" node="crc" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.394317 4773 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.496325 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884" exitCode=0 Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.496469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.496537 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.497828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.497883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.497903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.499563 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf" exitCode=0 Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.499704 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.500215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.500736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.500792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.500806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.502680 4773 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6" exitCode=0 Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.502817 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.502846 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.504000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.504031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.504042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:14 
crc kubenswrapper[4773]: I1012 20:24:14.506857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.506902 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.506909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.507239 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.507261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.508119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.508159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.508201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.510675 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d" exitCode=0 Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.510738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d"} Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.510829 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.511915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.511967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.511985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.513824 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.514886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.514940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:14 crc kubenswrapper[4773]: I1012 20:24:14.514957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: 
I1012 20:24:15.090513 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.393900 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:15 crc kubenswrapper[4773]: E1012 20:24:15.407693 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.202:6443: connect: connection refused" interval="3.2s" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518866 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.518892 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.519787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.519816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.519824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.521791 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776" exitCode=0 Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.521897 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.522242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.522537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.522564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.522572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.524636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"54bb5a373141b25b2b8a2e3e3f1ee55b22d419210354ff015f9f188a44eb74be"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.524695 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.525466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.525488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.525497 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.527478 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.527790 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528056 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e"} Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.528389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.532939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.532976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.532991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: W1012 20:24:15.630770 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused Oct 12 20:24:15 crc kubenswrapper[4773]: E1012 20:24:15.630874 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.672032 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.672932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.672963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.672973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:15 crc kubenswrapper[4773]: I1012 20:24:15.672996 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:15 crc kubenswrapper[4773]: E1012 20:24:15.673338 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.202:6443: connect: connection refused" node="crc" Oct 12 20:24:15 crc kubenswrapper[4773]: W1012 20:24:15.726355 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.202:6443: connect: connection refused 
Oct 12 20:24:15 crc kubenswrapper[4773]: E1012 20:24:15.726442 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.202:6443: connect: connection refused" logger="UnhandledError" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.532776 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4" exitCode=0 Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.532871 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.532904 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.532944 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.532975 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.533009 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.533075 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.533559 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.534154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4"} Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535611 4773 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.535629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.537409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.537434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:16 crc kubenswrapper[4773]: I1012 20:24:16.537442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.021540 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.214857 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.494347 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760"} Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541299 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425"} Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541319 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d"} Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541334 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f"} Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541335 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.541459 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.542973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.542999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.543006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.543036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.543051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:17 crc kubenswrapper[4773]: I1012 20:24:17.543100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.341828 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.551070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e"} Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.551154 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.551742 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.552357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.552411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.552431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.552972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.553024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.553044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.874286 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.876115 4773 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.876200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.876226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:18 crc kubenswrapper[4773]: I1012 20:24:18.876274 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.173648 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.173938 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.175564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.175931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.175960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.554188 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.554254 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555786 4773 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:19 crc kubenswrapper[4773]: I1012 20:24:19.555880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:21 crc kubenswrapper[4773]: I1012 20:24:21.209034 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 12 20:24:21 crc kubenswrapper[4773]: I1012 20:24:21.209294 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:21 crc kubenswrapper[4773]: I1012 20:24:21.211217 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:21 crc kubenswrapper[4773]: I1012 20:24:21.211294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:21 crc kubenswrapper[4773]: I1012 20:24:21.211318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.304224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.304487 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:22 crc 
kubenswrapper[4773]: I1012 20:24:22.306128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.306183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.306202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:22 crc kubenswrapper[4773]: E1012 20:24:22.575399 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.684122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.684320 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.685911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.685995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.686016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:22 crc kubenswrapper[4773]: I1012 20:24:22.692627 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.505112 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 12 20:24:23 crc kubenswrapper[4773]: 
I1012 20:24:23.505340 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.506875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.506933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.506951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.566114 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.567779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.567838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.567856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:23 crc kubenswrapper[4773]: I1012 20:24:23.574330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:24 crc kubenswrapper[4773]: I1012 20:24:24.568757 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:24 crc kubenswrapper[4773]: I1012 20:24:24.569784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:24 crc kubenswrapper[4773]: I1012 20:24:24.569825 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:24 crc kubenswrapper[4773]: I1012 20:24:24.569835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:25 crc kubenswrapper[4773]: I1012 20:24:25.304777 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 20:24:25 crc kubenswrapper[4773]: I1012 20:24:25.304884 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.395327 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 12 20:24:26 crc kubenswrapper[4773]: W1012 20:24:26.515458 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.515554 4773 trace.go:236] Trace[1260012078]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 20:24:16.514) (total time: 10001ms): Oct 12 20:24:26 crc kubenswrapper[4773]: Trace[1260012078]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:24:26.515) Oct 12 20:24:26 crc kubenswrapper[4773]: Trace[1260012078]: [10.001301008s] [10.001301008s] END Oct 12 20:24:26 crc kubenswrapper[4773]: E1012 20:24:26.515585 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.575353 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.577533 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a" exitCode=255 Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.577594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a"} Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.577813 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.578676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.578740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:26 crc kubenswrapper[4773]: 
I1012 20:24:26.578754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.579408 4773 scope.go:117] "RemoveContainer" containerID="2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a" Oct 12 20:24:26 crc kubenswrapper[4773]: W1012 20:24:26.586487 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 12 20:24:26 crc kubenswrapper[4773]: I1012 20:24:26.586575 4773 trace.go:236] Trace[1347941735]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 20:24:16.584) (total time: 10001ms): Oct 12 20:24:26 crc kubenswrapper[4773]: Trace[1347941735]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:24:26.586) Oct 12 20:24:26 crc kubenswrapper[4773]: Trace[1347941735]: [10.001729936s] [10.001729936s] END Oct 12 20:24:26 crc kubenswrapper[4773]: E1012 20:24:26.586598 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.021960 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.022044 4773 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.280291 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.280662 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.583207 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.586168 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5"} Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.586426 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:27 crc 
kubenswrapper[4773]: I1012 20:24:27.587773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.587823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:27 crc kubenswrapper[4773]: I1012 20:24:27.587838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.379585 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.387640 4773 apiserver.go:52] "Watching apiserver" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.394778 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.395124 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.395563 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.395863 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.396027 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.396038 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.396156 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:30 crc kubenswrapper[4773]: E1012 20:24:30.396242 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.396287 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:30 crc kubenswrapper[4773]: E1012 20:24:30.396422 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:30 crc kubenswrapper[4773]: E1012 20:24:30.396522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.398198 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.398616 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.398639 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.398800 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.398882 4773 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.402261 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.402924 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.402948 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.403207 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.403339 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 12 20:24:30 
crc kubenswrapper[4773]: I1012 20:24:30.444369 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.466527 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.488432 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.508498 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.523504 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.538752 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:30 crc kubenswrapper[4773]: I1012 20:24:30.552060 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:31 crc kubenswrapper[4773]: I1012 20:24:31.481043 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:31 crc kubenswrapper[4773]: E1012 20:24:31.481283 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:31 crc kubenswrapper[4773]: I1012 20:24:31.913523 4773 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.028567 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.029025 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.037707 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.042381 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.045206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.054426 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.066261 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.078206 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.088640 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.108126 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.122897 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.139200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.159636 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.183786 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.206880 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.218652 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.229494 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.273352 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.276417 4773 trace.go:236] Trace[1374551358]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 20:24:21.585) (total time: 10690ms): Oct 12 20:24:32 crc kubenswrapper[4773]: Trace[1374551358]: ---"Objects listed" error: 10690ms (20:24:32.276) Oct 12 20:24:32 crc kubenswrapper[4773]: Trace[1374551358]: [10.690954859s] [10.690954859s] END Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.276442 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.276997 4773 trace.go:236] Trace[1100292613]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 20:24:21.160) (total time: 11116ms): Oct 12 20:24:32 crc kubenswrapper[4773]: Trace[1100292613]: ---"Objects listed" error: 11116ms (20:24:32.276) Oct 12 20:24:32 crc kubenswrapper[4773]: Trace[1100292613]: [11.116324034s] [11.116324034s] END Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.277020 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.278519 4773 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.279514 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379465 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379543 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379598 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379664 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379734 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379751 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379785 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379821 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379852 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379866 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379899 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379915 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379934 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379953 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379949 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.379967 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380082 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.380119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380181 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380198 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380213 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380247 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 
20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380324 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380362 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380379 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380395 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380418 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380442 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380459 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.380475 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380489 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380576 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380622 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380651 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.380665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380683 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380701 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380727 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380743 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380758 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380773 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380805 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380885 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380921 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380969 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380986 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381004 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381074 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381089 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381123 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381157 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381171 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381200 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381244 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381259 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381275 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.381389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381406 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381421 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381467 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381497 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381513 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381529 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381544 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 
20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381563 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381594 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381640 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381656 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381670 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381687 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381703 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381748 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381784 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381799 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381815 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381872 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381889 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381905 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381936 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381953 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 
20:24:32.381968 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382000 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382066 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382082 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382098 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382113 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382180 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382213 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382231 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382248 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382264 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382281 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382297 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382313 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382344 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382359 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382393 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382460 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382477 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382499 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382518 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382536 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382554 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382571 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382621 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382638 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382672 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382708 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382765 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382805 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382825 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382856 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 
20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382873 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382890 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382906 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382925 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382943 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382959 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382978 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383016 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383047 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 
20:24:32.383064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383081 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383098 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383131 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383168 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383246 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383264 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 
20:24:32.383282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383340 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383425 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383487 4773 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath 
\"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383499 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.383509 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380481 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380732 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.380901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381249 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381391 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381518 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381547 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381554 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381713 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.395382 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.395637 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.395810 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.396124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.396398 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.396550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.396952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397186 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381994 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382411 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382504 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382561 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382697 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382749 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.382910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.383556 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397345 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397552 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.384245 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.384398 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.384620 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.384878 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.397759 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385075 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385153 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385336 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385468 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385496 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.386189 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.386380 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.387316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.387566 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:24:32.887548935 +0000 UTC m=+21.123847495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.397933 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:32.897923165 +0000 UTC m=+21.134221725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.398368 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.398367 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.388202 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.388258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.388749 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389042 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389103 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.398434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389699 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389808 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.389926 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390620 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390888 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390903 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390917 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390926 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.390923 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391070 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391347 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391439 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391596 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391729 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.391994 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392125 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392169 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392193 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392331 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392502 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392652 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393483 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393571 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393594 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393667 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392663 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393760 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.392757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393855 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393876 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394038 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394044 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394075 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394177 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394500 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394665 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394706 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394916 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.393170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.394979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.395022 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.385048 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.399515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.396560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.381897 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400319 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400396 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400480 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.400589 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.400958 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.401026 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:32.901004701 +0000 UTC m=+21.137303361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401070 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401119 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401524 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401594 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401694 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401873 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.401996 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402144 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402273 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402764 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.402848 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403114 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403172 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403226 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403391 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403900 4773 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.403970 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404346 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404353 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404907 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.404932 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.405238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.405405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.405557 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.405784 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.405914 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406143 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406371 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406608 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406894 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.406995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.407287 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.407671 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.408041 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.387809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.408530 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.409148 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.409503 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.409952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.410434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.411188 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.411223 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.412060 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.412317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.412870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.412875 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.414523 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.421448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.421529 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.421660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.421822 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.422029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.422316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.423423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.428151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.428848 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.428935 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.428998 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.429096 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:32.929078416 +0000 UTC m=+21.165376976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.429525 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.432466 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.432564 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.432633 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.432737 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:32.932708358 +0000 UTC m=+21.169006918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.434479 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.437342 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.437745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.438171 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.438865 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.448892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: 
"57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.454429 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.472850 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.476131 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.480391 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.480481 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.480603 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.480649 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.483560 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.484176 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.484786 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.484895 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.484987 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485055 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.485107 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485161 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485217 4773 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485273 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485327 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485381 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485434 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485487 4773 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485536 4773 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485588 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485641 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485694 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485762 4773 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485814 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485865 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485928 4773 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.485986 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486039 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486081 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486098 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486054 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.486206 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486262 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486314 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486369 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486419 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486472 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486529 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.486579 4773 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.489932 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.489954 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.489970 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.489983 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.489992 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490000 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490009 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490022 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490030 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490038 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490047 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490057 4773 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490065 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490073 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490084 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490091 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490099 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490107 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490117 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490125 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490133 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490141 4773 reconciler_common.go:293] "Volume detached for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490150 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490158 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490166 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490177 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490185 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490192 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490200 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" 
DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490210 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490217 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490225 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490233 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490243 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490258 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490266 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490274 4773 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490284 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490292 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490300 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490311 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490322 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490331 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490340 4773 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490350 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490358 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490366 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490374 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490386 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490394 4773 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490401 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node 
\"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490409 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490420 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490427 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490435 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490446 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490454 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490462 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: 
I1012 20:24:32.490470 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490481 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490489 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490497 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490504 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490514 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490523 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490530 4773 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490540 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490549 4773 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490557 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490564 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490574 4773 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490582 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490590 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490598 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490609 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490617 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490625 4773 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490633 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490643 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490651 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.490660 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490670 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490678 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490686 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490694 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490704 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490725 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490734 4773 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490743 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490753 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490761 4773 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490769 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490780 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490787 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490795 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490803 4773 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490813 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490822 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490830 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490838 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490849 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490857 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490866 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490875 4773 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490886 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490894 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490902 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490913 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490921 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490930 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490938 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490950 4773 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490958 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490965 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.490973 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491006 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491017 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491025 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491036 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491044 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491052 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491060 4773 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491070 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.491078 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491086 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491094 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491104 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491112 4773 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491120 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491128 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491138 4773 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491147 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491155 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491166 4773 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491174 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491182 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491190 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491200 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491209 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491217 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491225 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491235 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491242 4773 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491251 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491259 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.491268 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491276 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491284 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491295 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491303 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491311 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491320 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491331 
4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491340 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491348 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491356 4773 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491366 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491374 4773 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491382 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491393 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491400 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491408 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.491416 4773 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.492286 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.492849 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.493947 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.494652 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.495224 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.499980 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.500126 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.500770 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.501233 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.502282 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.502746 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.505044 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.505622 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.506587 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.507314 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.507784 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.508831 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.509412 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.509914 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.510951 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.511431 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.512458 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.512979 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.515027 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.516382 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.517100 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.517619 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.518641 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.519193 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.520960 4773 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.521124 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.521130 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.524950 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.526181 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.526616 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.526916 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.528637 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.530692 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.531318 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.532365 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.532385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.535507 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.536281 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.537055 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.538778 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.541378 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.544987 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.547303 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.548222 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.549565 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.550642 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.551683 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.554045 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.554877 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.558816 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.559479 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.560255 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 12 20:24:32 crc 
kubenswrapper[4773]: I1012 20:24:32.561480 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.563277 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.564530 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.570107 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: W1012 20:24:32.571837 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6414dc81663d1b53dc730299357292b0bff1122b9f2154092a8a73b419fcaeed WatchSource:0}: Error finding container 6414dc81663d1b53dc730299357292b0bff1122b9f2154092a8a73b419fcaeed: Status 404 returned error can't find the container with id 
6414dc81663d1b53dc730299357292b0bff1122b9f2154092a8a73b419fcaeed Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.597561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f33144d715188137f4fc89ad476fe26c50c57369145a89a95471567c39fa0b56"} Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.598082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"69f8aaa581146858b5726bf7b3f507f64ae59f8189f68c71a533f58d6e645e6b"} Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.599094 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6414dc81663d1b53dc730299357292b0bff1122b9f2154092a8a73b419fcaeed"} Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.609878 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.898237 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.898310 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.898419 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.898464 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:33.89845057 +0000 UTC m=+22.134749130 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.898530 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:24:33.898498201 +0000 UTC m=+22.134796761 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.906772 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.911899 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.931047 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.944995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.966457 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.983965 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.995868 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.999476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.999509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:32 crc kubenswrapper[4773]: I1012 20:24:32.999537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999621 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999630 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999648 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999659 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999674 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:33.99966348 +0000 UTC m=+22.235962040 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999688 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:33.99968029 +0000 UTC m=+22.235978850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999746 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999774 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999785 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Oct 12 20:24:32 crc kubenswrapper[4773]: E1012 20:24:32.999843 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:33.999826084 +0000 UTC m=+22.236124644 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.005641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.007647 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.016466 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.031774 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.041000 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.051361 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.060434 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.069700 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.077706 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.086789 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.094924 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.480020 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:33 crc kubenswrapper[4773]: E1012 20:24:33.480136 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.545874 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6sl6z"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.546209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.552026 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.552039 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.553323 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.560943 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.573407 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.577112 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 
20:24:33.586825 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.598788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.602316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce"} Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.602355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee"} Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.603307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f"} Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.614551 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.631262 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.646365 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.659482 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.693052 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.705412 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b8397b6-40c2-4993-a5c7-a2afe63667ca-hosts-file\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.705469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrfzh\" (UniqueName: \"kubernetes.io/projected/5b8397b6-40c2-4993-a5c7-a2afe63667ca-kube-api-access-wrfzh\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.711864 4773 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.725912 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.746037 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.774185 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.791937 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.806646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b8397b6-40c2-4993-a5c7-a2afe63667ca-hosts-file\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc 
kubenswrapper[4773]: I1012 20:24:33.806692 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrfzh\" (UniqueName: \"kubernetes.io/projected/5b8397b6-40c2-4993-a5c7-a2afe63667ca-kube-api-access-wrfzh\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.806990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b8397b6-40c2-4993-a5c7-a2afe63667ca-hosts-file\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.820456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.826775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrfzh\" (UniqueName: \"kubernetes.io/projected/5b8397b6-40c2-4993-a5c7-a2afe63667ca-kube-api-access-wrfzh\") pod \"node-resolver-6sl6z\" (UID: \"5b8397b6-40c2-4993-a5c7-a2afe63667ca\") " pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.853553 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.859156 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6sl6z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.867045 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.893705 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.907854 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.907925 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:33 crc kubenswrapper[4773]: E1012 20:24:33.907999 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:24:35.907984658 +0000 UTC m=+24.144283218 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:33 crc kubenswrapper[4773]: E1012 20:24:33.908049 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:33 crc kubenswrapper[4773]: E1012 20:24:33.908084 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:35.90807813 +0000 UTC m=+24.144376690 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.908046 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.919151 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.929893 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.958369 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cbx9j"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.958680 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.959183 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-67c6h"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.959382 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-67c6h" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.960980 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.961162 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.961566 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.962481 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.962639 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.962782 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.963030 4773 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.963180 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.963609 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.967605 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jdcn7"] Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.968278 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.969488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.970899 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.973632 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.980150 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:33 crc kubenswrapper[4773]: I1012 20:24:33.993269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:33Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.004917 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.008345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.008406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.008434 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008541 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008562 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 
20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008563 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008573 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008585 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008599 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008603 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008630 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:36.008612981 +0000 UTC m=+24.244911541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008694 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:36.008674313 +0000 UTC m=+24.244972873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.008742 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:36.008707004 +0000 UTC m=+24.245005574 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.023104 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6
a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.033265 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, 
/tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.042928 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.054379 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.066478 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.078078 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.089568 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.100739 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-hostroot\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109668 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-socket-dir-parent\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-multus-certs\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109701 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109751 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-bin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cnibin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-kubelet\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-etc-kubernetes\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109823 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-k8s-cni-cncf-io\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.109941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-pm9pc\" (UniqueName: \"kubernetes.io/projected/69ad9308-d890-40f4-9b73-fb4aad78ccd1-kube-api-access-pm9pc\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110007 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7lj\" (UniqueName: \"kubernetes.io/projected/b2d85a10-4066-430e-ac9f-533080da69f7-kube-api-access-7g7lj\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-proxy-tls\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110093 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-multus\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 
20:24:34.110116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-netns\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-system-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110157 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-os-release\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110179 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-conf-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110283 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-daemon-config\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-rootfs\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cni-binary-copy\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-cnibin\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110402 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-os-release\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.110467 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmg4\" (UniqueName: \"kubernetes.io/projected/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-kube-api-access-7lmg4\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.114015 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.126318 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.143372 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.159695 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.172679 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.183164 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.193046 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.204900 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-bin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cnibin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-kubelet\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-etc-kubernetes\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9pc\" (UniqueName: \"kubernetes.io/projected/69ad9308-d890-40f4-9b73-fb4aad78ccd1-kube-api-access-pm9pc\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-k8s-cni-cncf-io\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211617 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7lj\" (UniqueName: 
\"kubernetes.io/projected/b2d85a10-4066-430e-ac9f-533080da69f7-kube-api-access-7g7lj\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-proxy-tls\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211645 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-multus\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-netns\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-system-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211736 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-os-release\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-daemon-config\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-rootfs\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211818 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cni-binary-copy\") pod 
\"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-conf-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-cnibin\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211868 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-os-release\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211884 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmg4\" (UniqueName: \"kubernetes.io/projected/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-kube-api-access-7lmg4\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-hostroot\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211937 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-multus-certs\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211953 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.211982 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-socket-dir-parent\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " 
pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212051 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-socket-dir-parent\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212089 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-bin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212117 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cnibin\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-kubelet\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212160 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-etc-kubernetes\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212809 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-rootfs\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-k8s-cni-cncf-io\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.212997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-conf-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-cnibin\") pod 
\"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-hostroot\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213262 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-os-release\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-multus-certs\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213512 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-var-lib-cni-multus\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " 
pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-os-release\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-cni-binary-copy\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213640 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-system-cni-dir\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.213747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69ad9308-d890-40f4-9b73-fb4aad78ccd1-host-run-netns\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.214172 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d85a10-4066-430e-ac9f-533080da69f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.214234 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.214762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d85a10-4066-430e-ac9f-533080da69f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.215150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69ad9308-d890-40f4-9b73-fb4aad78ccd1-multus-daemon-config\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.218901 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.221072 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-proxy-tls\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.229759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9pc\" (UniqueName: \"kubernetes.io/projected/69ad9308-d890-40f4-9b73-fb4aad78ccd1-kube-api-access-pm9pc\") pod \"multus-67c6h\" (UID: \"69ad9308-d890-40f4-9b73-fb4aad78ccd1\") " pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.230015 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7lj\" (UniqueName: \"kubernetes.io/projected/b2d85a10-4066-430e-ac9f-533080da69f7-kube-api-access-7g7lj\") pod \"multus-additional-cni-plugins-jdcn7\" (UID: \"b2d85a10-4066-430e-ac9f-533080da69f7\") " pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.231205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmg4\" (UniqueName: \"kubernetes.io/projected/c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f-kube-api-access-7lmg4\") pod \"machine-config-daemon-cbx9j\" (UID: \"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\") " pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.239009 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.250577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.262108 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.273075 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.277283 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.282476 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-67c6h" Oct 12 20:24:34 crc kubenswrapper[4773]: W1012 20:24:34.284896 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4659ccb_e7e6_4c79_9f0b_5e8c3c2aad4f.slice/crio-013d29995999c401003b330c3c32dab2eb5997844e48e673e68b5b36d8a8c5f2 WatchSource:0}: Error finding container 013d29995999c401003b330c3c32dab2eb5997844e48e673e68b5b36d8a8c5f2: Status 404 returned error can't find the container with id 013d29995999c401003b330c3c32dab2eb5997844e48e673e68b5b36d8a8c5f2 Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.288473 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.353803 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tzm6q"] Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.354551 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359087 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359269 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359368 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359426 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359552 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359620 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.359378 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.385283 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.412943 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.427777 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.455751 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.471447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.480578 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.480674 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.480982 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.481039 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.492196 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.511675 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515222 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515268 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units\") pod \"ovnkube-node-tzm6q\" (UID: 
\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515306 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config\") pod \"ovnkube-node-tzm6q\" 
(UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnldc\" (UniqueName: \"kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515402 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515433 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515496 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515526 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515539 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.515553 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.533669 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.556291 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.583038 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.604814 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.607605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.607636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.607645 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"013d29995999c401003b330c3c32dab2eb5997844e48e673e68b5b36d8a8c5f2"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.609225 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6sl6z" event={"ID":"5b8397b6-40c2-4993-a5c7-a2afe63667ca","Type":"ContainerStarted","Data":"9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 
20:24:34.609245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6sl6z" event={"ID":"5b8397b6-40c2-4993-a5c7-a2afe63667ca","Type":"ContainerStarted","Data":"aa8ec3a4ae7b3b60974408f0b6835a33b979bc28314142ad5dd3cc2eca496c28"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.610557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.612028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerStarted","Data":"3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.612050 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerStarted","Data":"50cf6251fbf47fb42ea60ad83a4386a6063e52bdf8585a574c39faa92f0847ad"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.613590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerStarted","Data":"249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.613613 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerStarted","Data":"1308da9e21fbe75bdf2db7fbf50d00e1996efc5e90f256bebd39efe12c42eabb"} Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616066 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616168 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616255 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616278 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib\") pod 
\"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnldc\" (UniqueName: \"kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616850 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616879 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616939 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 
12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616979 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.616996 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.617985 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.618555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash\") pod 
\"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.619150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.619435 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.620049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: E1012 20:24:34.627550 4773 
kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.638849 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnldc\" (UniqueName: \"kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc\") pod \"ovnkube-node-tzm6q\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.640653 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.654409 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.667451 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.670541 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.682499 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: W1012 20:24:34.682636 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd89b89_9347_4b0d_8861_4ff26c9640b5.slice/crio-7c19adea7522cbf456c1874092f7c4c3a34f13a4c3c3fba58192f3d62ce50ab6 WatchSource:0}: Error finding container 7c19adea7522cbf456c1874092f7c4c3a34f13a4c3c3fba58192f3d62ce50ab6: Status 404 returned error can't find the container with id 7c19adea7522cbf456c1874092f7c4c3a34f13a4c3c3fba58192f3d62ce50ab6 Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.702663 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.738510 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.779339 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.818309 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.863053 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.905990 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.943326 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:34 crc kubenswrapper[4773]: I1012 20:24:34.987987 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6
408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:34Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.018157 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, 
/tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.058856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.099912 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.143061 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.480283 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:35 crc kubenswrapper[4773]: E1012 20:24:35.480389 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.620827 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b" exitCode=0 Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.620913 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b"} Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.623373 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" exitCode=0 Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.623412 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.623437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"7c19adea7522cbf456c1874092f7c4c3a34f13a4c3c3fba58192f3d62ce50ab6"} Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.642810 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.665424 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.679203 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.696431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.708663 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.718505 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.730118 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.741474 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.753739 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.766803 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.779635 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.790399 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.802174 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.818803 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.827304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T
20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.840541 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.851928 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-h94p2"] Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.852320 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.854185 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.854767 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.862975 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\
\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.870327 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.889822 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.938108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.938205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.938230 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34dec32f-3ec7-4897-bb63-9f8e018fe743-host\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:35 crc kubenswrapper[4773]: E1012 20:24:35.938281 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:35 crc kubenswrapper[4773]: E1012 20:24:35.938295 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:24:39.938267607 +0000 UTC m=+28.174566167 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.938376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34dec32f-3ec7-4897-bb63-9f8e018fe743-serviceca\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.938460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8p8p\" (UniqueName: 
\"kubernetes.io/projected/34dec32f-3ec7-4897-bb63-9f8e018fe743-kube-api-access-x8p8p\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:35 crc kubenswrapper[4773]: E1012 20:24:35.938532 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:39.938515724 +0000 UTC m=+28.174814284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.939963 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:35 crc kubenswrapper[4773]: I1012 20:24:35.980813 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:35Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.021252 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.039708 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.039777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8p8p\" (UniqueName: \"kubernetes.io/projected/34dec32f-3ec7-4897-bb63-9f8e018fe743-kube-api-access-x8p8p\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.039803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.039831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34dec32f-3ec7-4897-bb63-9f8e018fe743-host\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.040016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.039955 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:36 
crc kubenswrapper[4773]: E1012 20:24:36.040081 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040110 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040173 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:40.040152976 +0000 UTC m=+28.276451546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.040038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34dec32f-3ec7-4897-bb63-9f8e018fe743-serviceca\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.039977 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040552 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040631 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:40.040610309 +0000 UTC m=+28.276908879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.039981 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34dec32f-3ec7-4897-bb63-9f8e018fe743-host\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040561 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.040704 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:36 crc 
kubenswrapper[4773]: E1012 20:24:36.040752 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:40.040743673 +0000 UTC m=+28.277042243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.041351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34dec32f-3ec7-4897-bb63-9f8e018fe743-serviceca\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.063142 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.090995 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8p8p\" (UniqueName: \"kubernetes.io/projected/34dec32f-3ec7-4897-bb63-9f8e018fe743-kube-api-access-x8p8p\") pod \"node-ca-h94p2\" (UID: \"34dec32f-3ec7-4897-bb63-9f8e018fe743\") " pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.130338 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.164255 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.169806 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h94p2" Oct 12 20:24:36 crc kubenswrapper[4773]: W1012 20:24:36.184817 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34dec32f_3ec7_4897_bb63_9f8e018fe743.slice/crio-df6c0153cdf2d0d242b9d20bd2a41a3104db957d92445c4ce2b300129aa77f77 WatchSource:0}: Error finding container df6c0153cdf2d0d242b9d20bd2a41a3104db957d92445c4ce2b300129aa77f77: Status 404 returned error can't find the container with id df6c0153cdf2d0d242b9d20bd2a41a3104db957d92445c4ce2b300129aa77f77 Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.209346 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.256229 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.284196 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.320819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.366654 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.398506 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.437723 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.478907 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.480015 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.480150 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.480278 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:36 crc kubenswrapper[4773]: E1012 20:24:36.480400 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.519007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.559661 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.604333 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.628162 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa" exitCode=0 Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.628231 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632820 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.632847 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.634183 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h94p2" event={"ID":"34dec32f-3ec7-4897-bb63-9f8e018fe743","Type":"ContainerStarted","Data":"e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.634230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h94p2" event={"ID":"34dec32f-3ec7-4897-bb63-9f8e018fe743","Type":"ContainerStarted","Data":"df6c0153cdf2d0d242b9d20bd2a41a3104db957d92445c4ce2b300129aa77f77"} Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.647598 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.681012 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.719149 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.758477 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.802473 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.839110 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.878190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.917193 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.956230 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:36 crc kubenswrapper[4773]: I1012 20:24:36.999756 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.039991 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.081089 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.120764 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.164823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.214593 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.249481 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.282273 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.323096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.363927 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.409545 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.440128 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.478993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.480079 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:37 crc kubenswrapper[4773]: E1012 20:24:37.480191 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.501475 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.518341 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.559249 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.614576 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.640195 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778" exitCode=0 Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.640234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778"} Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.654805 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.683852 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.721446 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.766308 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.797351 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.839830 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.877510 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.916569 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:37 crc kubenswrapper[4773]: I1012 20:24:37.959967 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.006316 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.043413 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.083668 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.129691 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.168753 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.200503 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.240593 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.282234 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.321317 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.363080 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.403040 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.444294 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.480479 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.480479 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.480679 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.480775 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.488699 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.533450 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.567376 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.603438 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.644395 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.646679 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776" exitCode=0 Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.646861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.651250 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.680294 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 20:24:38 crc 
kubenswrapper[4773]: I1012 20:24:38.685470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.685530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.685551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.685772 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.687880 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.732000 4773 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.732253 4773 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.733445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.733489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.733503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.733523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.733536 4773 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.746533 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.749837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.749876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.749888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.749905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.749915 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.761098 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 
20:24:38.764040 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.767464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.767506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.767518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.767535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.767546 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.784702 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.790298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.790328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.790339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.790354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.790365 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.803983 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.809443 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.812761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.812788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.812799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.812816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.812827 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.836230 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: E1012 20:24:38.836540 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.838291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.838311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.838318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.838332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.838343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.841467 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.880819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.920001 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.941284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.941435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.941515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.941596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.941679 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:38Z","lastTransitionTime":"2025-10-12T20:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:38 crc kubenswrapper[4773]: I1012 20:24:38.959069 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.000153 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.044756 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.045213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.045236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.045246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.045264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.045275 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.084419 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.118908 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.151888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.151916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.151924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.151936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.151944 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.159282 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35
fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.203193 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.242088 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.254768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.254800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.254812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.254829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.254840 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.277959 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.319126 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358129 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358124 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.358197 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.423050 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.461071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.461117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.461133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.461162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.461183 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.480457 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:39 crc kubenswrapper[4773]: E1012 20:24:39.480577 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.563577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.563640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.563655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.563700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.563730 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.658599 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba" exitCode=0 Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.658639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.669950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.669991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.670003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.670020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.670033 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.677086 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.706931 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.721943 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.738503 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.756323 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.768439 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.772858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.772887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.772899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.772917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.772929 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.782451 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.794112 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.804501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.815841 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.840883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.875924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.875971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.875982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.876001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.876013 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.880148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.918563 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.961562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.978055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:39 crc 
kubenswrapper[4773]: I1012 20:24:39.978089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.978118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.978137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.978150 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:39Z","lastTransitionTime":"2025-10-12T20:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.982958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:39 crc kubenswrapper[4773]: I1012 20:24:39.983086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:39 crc kubenswrapper[4773]: E1012 20:24:39.983124 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:24:47.983108034 +0000 UTC m=+36.219406594 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:39 crc kubenswrapper[4773]: E1012 20:24:39.983173 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:39 crc kubenswrapper[4773]: E1012 20:24:39.983219 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:47.983208327 +0000 UTC m=+36.219506907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.003633 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.084707 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.084923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.085000 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085459 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085505 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085526 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085610 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:48.085586089 +0000 UTC m=+36.321884689 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085465 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.085948 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.086077 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.086261 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:48.086220287 +0000 UTC m=+36.322518887 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.086409 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.086494 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:48.086469764 +0000 UTC m=+36.322768364 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.086768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.086807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.086820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.086838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.086859 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.189453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.189662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.189757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.189865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.189940 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.292495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.292706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.292820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.292897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.292967 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.396524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.396574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.396591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.396615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.396632 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.480120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.480304 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.480669 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:40 crc kubenswrapper[4773]: E1012 20:24:40.480890 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.500077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.500130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.500147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.500170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.500188 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.603422 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.603462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.603473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.603487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.603496 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.668481 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2d85a10-4066-430e-ac9f-533080da69f7" containerID="596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4" exitCode=0 Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.668538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerDied","Data":"596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.690475 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.709299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.709349 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.709360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.709377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.709390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.714084 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.751490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.772431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.787453 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.802344 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.814660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.815658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.815683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.815691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.815706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.815731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.824320 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.833957 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.845172 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.854701 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.866953 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.880575 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.895903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.912995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:40Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.918202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.918237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.918249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.918263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:40 crc kubenswrapper[4773]: I1012 20:24:40.918273 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:40Z","lastTransitionTime":"2025-10-12T20:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.020640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.020677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.020688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.020748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.020760 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.123557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.123602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.123617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.123633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.123646 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.230309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.230350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.230366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.230387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.230431 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.333513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.333557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.333567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.333581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.333590 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.436441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.436479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.436490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.436506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.436517 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.480811 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:41 crc kubenswrapper[4773]: E1012 20:24:41.480942 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.540188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.540248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.540261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.540277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.540314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.642957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.643001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.643012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.643031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.643046 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.676386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" event={"ID":"b2d85a10-4066-430e-ac9f-533080da69f7","Type":"ContainerStarted","Data":"1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.683483 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.683804 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.710844 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.718053 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.726881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.738889 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.745090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.745126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.745135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.745149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.745159 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.753497 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.767934 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.782102 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.798921 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.811096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.820153 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.834674 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.847355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.847547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.847645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.847752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.847875 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.848926 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.861997 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.876706 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.891616 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.908566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.919954 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.938398 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.950300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.950326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.950334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.950347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.950354 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:41Z","lastTransitionTime":"2025-10-12T20:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.971437 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.986606 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:41 crc kubenswrapper[4773]: I1012 20:24:41.998648 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.017973 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.038237 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.051614 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.052812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.052872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.052890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 
20:24:42.052913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.052930 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.067107 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.079202 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.088170 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.100867 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.113787 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.129831 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.139403 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.155173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.155202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.155210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc 
kubenswrapper[4773]: I1012 20:24:42.155224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.155233 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.257017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.257043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.257054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.257069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.257080 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.359217 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.359249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.359257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.359271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.359280 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.461700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.461787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.461804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.461830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.461847 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.480864 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.480932 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:42 crc kubenswrapper[4773]: E1012 20:24:42.481019 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:42 crc kubenswrapper[4773]: E1012 20:24:42.481133 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.499341 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb914
4e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.526152 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.542571 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa
1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35
e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T
20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.564258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.564314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.564333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.564357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.564375 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.566646 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af385947
4fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.591200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.605043 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.626473 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.645935 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.660673 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.667290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.667422 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.667545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.667679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.667834 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.676923 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.686093 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.686745 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.690809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.704588 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.716950 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.719348 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.733884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.746614 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.762282 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.770617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc 
kubenswrapper[4773]: I1012 20:24:42.770678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.770689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.770707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.770722 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.785797 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.809231 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.825910 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.841250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.859386 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.873215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.873270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.873292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.873340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.873358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.879333 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.896445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.909059 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.923456 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.934672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.954430 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.970525 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.975622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.975678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.975697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.975725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.975767 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:42Z","lastTransitionTime":"2025-10-12T20:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:42 crc kubenswrapper[4773]: I1012 20:24:42.987192 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.014496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:43Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.078060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.078099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.078114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc 
kubenswrapper[4773]: I1012 20:24:43.078146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.078156 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.180664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.180746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.180765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.180791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.180806 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.283246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.283288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.283302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.283319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.283331 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.386481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.386532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.386544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.386565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.386578 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.480210 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:43 crc kubenswrapper[4773]: E1012 20:24:43.480396 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.489179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.489213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.489221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.489236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.489245 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.592129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.592188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.592206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.592231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.592249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.691045 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.694706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.694840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.694861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.694930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.694955 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.803423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.803468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.803479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.803497 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.803508 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.905700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.905753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.905765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.905781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:43 crc kubenswrapper[4773]: I1012 20:24:43.905793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:43Z","lastTransitionTime":"2025-10-12T20:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.009177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.009292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.009303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.009318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.009328 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.112994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.113051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.113067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.113091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.113110 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.216450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.216509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.216532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.216561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.216585 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.320682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.320788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.320812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.320840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.320860 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.423493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.423541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.423557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.423579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.423598 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.481530 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.481656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:44 crc kubenswrapper[4773]: E1012 20:24:44.481856 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:44 crc kubenswrapper[4773]: E1012 20:24:44.482005 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.527290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.527337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.527354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.527380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.527403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.630037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.630108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.630131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.630162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.630184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.699124 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/0.log" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.703916 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74" exitCode=1 Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.703974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.705149 4773 scope.go:117] "RemoveContainer" containerID="8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.732326 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.735459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.735500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.735517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 
20:24:44.735546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.735564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.753531 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.776536 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.795412 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.830056 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.837717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc 
kubenswrapper[4773]: I1012 20:24:44.837779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.837791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.837808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.837820 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.885819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20
:24:43Z\\\",\\\"message\\\":\\\" handler 8 for removal\\\\nI1012 20:24:43.688027 5974 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:43.688033 5974 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:43.688070 5974 factory.go:656] Stopping watch factory\\\\nI1012 20:24:43.688099 5974 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:43.688111 5974 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:43.688119 5974 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:43.688127 5974 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:43.687957 5974 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688179 5974 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688311 5974 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 20:24:43.688321 5974 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:43.688329 5974 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:43.688796 5974 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960b
d2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.906517 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1
d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.939874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.939907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.939916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.939931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.939940 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:44Z","lastTransitionTime":"2025-10-12T20:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.940738 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.952228 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.965577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.980756 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.989890 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:44 crc kubenswrapper[4773]: I1012 20:24:44.998689 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:44Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.009632 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.019310 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.042198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.042224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.042233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.042247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.042255 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.144166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.144211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.144228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.144250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.144267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.246381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.246632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.246788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.246927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.247009 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.349397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.349424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.349432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.349444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.349452 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.452216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.452247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.452255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.452268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.452277 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.481157 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:45 crc kubenswrapper[4773]: E1012 20:24:45.481344 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.554756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.554813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.554829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.554854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.554874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.657167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.657225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.657237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.657254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.657264 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.709590 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/1.log" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.710991 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/0.log" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.713598 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" exitCode=1 Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.713636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.713687 4773 scope.go:117] "RemoveContainer" containerID="8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.715151 4773 scope.go:117] "RemoveContainer" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" Oct 12 20:24:45 crc kubenswrapper[4773]: E1012 20:24:45.715628 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.730818 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.759533 4773 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:43Z\\\",\\\"message\\\":\\\" handler 8 for removal\\\\nI1012 20:24:43.688027 5974 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:43.688033 5974 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:43.688070 5974 factory.go:656] Stopping watch factory\\\\nI1012 20:24:43.688099 5974 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1012 20:24:43.688111 5974 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:43.688119 5974 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:43.688127 5974 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:43.687957 5974 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688179 5974 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688311 5974 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 20:24:43.688321 5974 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:43.688329 5974 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:43.688796 5974 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/
var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.760775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 
20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.760828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.760846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.760870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.760889 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.777610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.798636 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.824499 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.846236 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.862859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.862892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.862903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.862919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.862930 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.865702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35
fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.880906 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.894692 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.907473 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.926448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.944125 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.959458 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.965246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.965307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.965326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.965351 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.965369 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:45Z","lastTransitionTime":"2025-10-12T20:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.980553 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:45 crc kubenswrapper[4773]: I1012 20:24:45.998882 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:45Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.074147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.074224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.074244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.074271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.074289 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.177341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.177409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.177428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.177452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.177469 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.280155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.280202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.280214 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.280231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.280244 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.301518 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj"] Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.302008 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.305433 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.305555 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.330809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.350522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.350629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.350670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.350801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgcjk\" (UniqueName: \"kubernetes.io/projected/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-kube-api-access-xgcjk\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.351951 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.371114 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.383354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.383411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.383472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.383527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.383548 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.389063 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.408290 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.427716 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.446123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.452175 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.452235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.452302 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgcjk\" (UniqueName: \"kubernetes.io/projected/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-kube-api-access-xgcjk\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.452352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.453085 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.453240 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.461726 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.463273 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.476220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.477254 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgcjk\" (UniqueName: \"kubernetes.io/projected/7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1-kube-api-access-xgcjk\") pod \"ovnkube-control-plane-749d76644c-rnnqj\" (UID: \"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.480942 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:46 crc kubenswrapper[4773]: E1012 20:24:46.481052 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.480946 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:46 crc kubenswrapper[4773]: E1012 20:24:46.481233 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.485853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.485896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.485912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.485935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.485952 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.496034 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.517303 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.535260 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.548455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.561820 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.580904 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20
:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.588109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.588172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.588191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.588216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.588235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.605496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d31b7c22428674830f80c75266a3865db4c8fa36abd8a1e3b957b368bf94c74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:43Z\\\",\\\"message\\\":\\\" handler 8 for removal\\\\nI1012 20:24:43.688027 5974 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:43.688033 5974 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:43.688070 5974 factory.go:656] Stopping watch factory\\\\nI1012 20:24:43.688099 5974 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI1012 20:24:43.688111 5974 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:43.688119 5974 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:43.688127 5974 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:43.687957 5974 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688179 5974 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1012 20:24:43.688311 5974 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 20:24:43.688321 5974 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:43.688329 5974 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:43.688796 5974 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/
var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.625821 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" Oct 12 20:24:46 crc kubenswrapper[4773]: W1012 20:24:46.644857 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7322c897_b1e2_48d0_a8b9_3c22cc8a4fc1.slice/crio-6778a736a2a7169747bd7d823f742eba1f3262a822a0580448774709e755b21e WatchSource:0}: Error finding container 6778a736a2a7169747bd7d823f742eba1f3262a822a0580448774709e755b21e: Status 404 returned error can't find the container with id 6778a736a2a7169747bd7d823f742eba1f3262a822a0580448774709e755b21e Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.692460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.692506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.692521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.692544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.692561 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.718338 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/1.log" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.722168 4773 scope.go:117] "RemoveContainer" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" Oct 12 20:24:46 crc kubenswrapper[4773]: E1012 20:24:46.722391 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.722761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" event={"ID":"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1","Type":"ContainerStarted","Data":"6778a736a2a7169747bd7d823f742eba1f3262a822a0580448774709e755b21e"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.737020 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.765590 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.778297 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.792134 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.796732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.796814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.796848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.796878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.796899 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.816157 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.830538 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.844858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.856291 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.866214 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.874829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.884393 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.895651 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.898830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.898861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.898869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:46 crc 
kubenswrapper[4773]: I1012 20:24:46.898883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.898892 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:46Z","lastTransitionTime":"2025-10-12T20:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.905496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.917943 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.927875 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:46 crc kubenswrapper[4773]: I1012 20:24:46.938746 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:46Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.002315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.002349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.002359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.002374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.002384 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.104239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.104268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.104277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.104291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.104302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.206569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.206929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.207008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.207044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.207066 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.309530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.309561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.309592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.309608 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.309615 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.412917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.412969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.412980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.412995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.413004 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.480255 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:47 crc kubenswrapper[4773]: E1012 20:24:47.480366 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.516042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.516073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.516081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.516094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.516103 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.619808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.619872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.619896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.619926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.619949 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.723030 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.723071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.723085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.723103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.723116 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.728668 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" event={"ID":"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1","Type":"ContainerStarted","Data":"69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.728728 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" event={"ID":"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1","Type":"ContainerStarted","Data":"f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.746794 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.766154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.781636 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6sbfz"] Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.782476 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:47 crc kubenswrapper[4773]: E1012 20:24:47.782582 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.789414 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.814387 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.826220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.826266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.826281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.826302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.826315 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.832223 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.848702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.868048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.868104 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvdh\" (UniqueName: \"kubernetes.io/projected/a0e0fa58-fcd9-4002-a975-a98fcba0f364-kube-api-access-qqvdh\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.871281 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.891029 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.912795 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.929162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.929200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.929213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.929232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.929245 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:47Z","lastTransitionTime":"2025-10-12T20:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.946681 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.969653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.969748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvdh\" (UniqueName: \"kubernetes.io/projected/a0e0fa58-fcd9-4002-a975-a98fcba0f364-kube-api-access-qqvdh\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:47 crc kubenswrapper[4773]: E1012 20:24:47.969895 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:47 crc kubenswrapper[4773]: E1012 20:24:47.969992 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:48.469965766 +0000 UTC m=+36.706264356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.976674 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-reso
urces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a29
7c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.994819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:47Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:47 crc kubenswrapper[4773]: I1012 20:24:47.995021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvdh\" (UniqueName: \"kubernetes.io/projected/a0e0fa58-fcd9-4002-a975-a98fcba0f364-kube-api-access-qqvdh\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.014400 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.027295 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.032222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.032431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.032546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.032793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.032941 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.040555 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.059800 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.071235 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:24:48 crc 
kubenswrapper[4773]: I1012 20:24:48.071393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.071450 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:25:04.071420492 +0000 UTC m=+52.307719082 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.071536 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.071657 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:04.071625698 +0000 UTC m=+52.307924368 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.076003 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.091479 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.109696 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.126980 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.136451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.136482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.136493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc 
kubenswrapper[4773]: I1012 20:24:48.136514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.136526 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.144591 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12
T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.160323 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc 
kubenswrapper[4773]: I1012 20:24:48.172488 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.172570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.172605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172713 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172773 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172792 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172795 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172815 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172815 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172867 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:04.172842828 +0000 UTC m=+52.409141408 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172925 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:04.17289841 +0000 UTC m=+52.409197010 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.172831 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.173002 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:04.172984992 +0000 UTC m=+52.409283652 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.179890 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.203961 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.228917 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.239285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.239319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.239330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.239355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.239366 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.253064 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.274167 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.290076 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.314317 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.335152 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.341490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.341537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.341556 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.341576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.341591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.354509 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.369604 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.386545 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:48Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.443716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.443777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.443787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.443805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.443817 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.476077 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.476219 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.476289 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:49.476269202 +0000 UTC m=+37.712567772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.481039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.481039 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.481184 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:48 crc kubenswrapper[4773]: E1012 20:24:48.481259 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.546695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.546741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.546752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.546768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.546783 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.650063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.650107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.650121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.650151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.650171 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.752845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.752940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.752962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.752986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.753046 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.855888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.855940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.855959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.855984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.856002 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.961138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.961196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.961231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.961258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:48 crc kubenswrapper[4773]: I1012 20:24:48.961277 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:48Z","lastTransitionTime":"2025-10-12T20:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.065088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.065182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.065201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.065257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.065275 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.168811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.168865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.168882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.168905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.168922 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.200423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.200508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.200525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.200550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.200566 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.224113 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:49Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.230160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.230223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.230241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.230268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.230285 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.251090 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:49Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.257269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.257339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.257359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.257386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.257403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.279511 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:49Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.284448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.284502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.284523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.284553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.284575 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.306862 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:49Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.312408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.312496 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.312516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.312541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.312590 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.334313 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:49Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.334673 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.337168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.337241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.337268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.337298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.337315 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.440878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.440924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.440935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.440954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.440979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.480637 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.480666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.480774 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.480956 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.486219 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.486468 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:49 crc kubenswrapper[4773]: E1012 20:24:49.486552 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:51.48652682 +0000 UTC m=+39.722825420 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.543416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.543479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.543504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.543565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.543590 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.646697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.646770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.646787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.646810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.646829 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.748658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.748694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.748706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.748743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.748757 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.852554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.852605 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.852925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.852945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.852957 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.956864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.957248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.957376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.957519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:49 crc kubenswrapper[4773]: I1012 20:24:49.957681 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:49Z","lastTransitionTime":"2025-10-12T20:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.060488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.060553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.060567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.060586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.060619 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.163600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.163677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.163691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.163708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.163735 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.266288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.266368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.266428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.266459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.266481 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.371894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.371961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.371980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.372005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.372023 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.475663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.475718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.475747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.475764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.475777 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.480426 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.480498 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:50 crc kubenswrapper[4773]: E1012 20:24:50.480613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:50 crc kubenswrapper[4773]: E1012 20:24:50.480711 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.579809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.579856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.579875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.579900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.579919 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.683948 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.684031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.684057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.684084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.684102 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.787015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.787378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.787558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.787774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.787922 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.890652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.890689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.890700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.890720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.890746 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.993654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.993712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.993761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.993786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:50 crc kubenswrapper[4773]: I1012 20:24:50.993804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:50Z","lastTransitionTime":"2025-10-12T20:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.096908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.096959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.096971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.096990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.097003 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.199581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.199647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.199664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.199699 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.199713 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.302630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.302690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.302706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.302757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.302775 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.405940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.406018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.406042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.406073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.406095 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.480885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:51 crc kubenswrapper[4773]: E1012 20:24:51.481014 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.481177 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:51 crc kubenswrapper[4773]: E1012 20:24:51.481380 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.508194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.508253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.508271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.508294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.508313 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.510885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:51 crc kubenswrapper[4773]: E1012 20:24:51.511051 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:51 crc kubenswrapper[4773]: E1012 20:24:51.511152 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:24:55.511124579 +0000 UTC m=+43.747423179 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.611470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.611551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.611571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.611596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.611617 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.714458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.714519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.714538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.714563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.714586 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.818202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.818256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.818274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.818298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.818314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.920643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.920826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.920898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.920930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:51 crc kubenswrapper[4773]: I1012 20:24:51.920953 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:51Z","lastTransitionTime":"2025-10-12T20:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.024144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.024238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.024264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.024298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.024321 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.128280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.128328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.128343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.128365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.128380 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.232307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.232387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.232411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.232440 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.232462 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.336020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.336092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.336115 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.336145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.336162 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.439068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.439441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.439600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.439806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.439982 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.480403 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:52 crc kubenswrapper[4773]: E1012 20:24:52.480539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.480404 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:52 crc kubenswrapper[4773]: E1012 20:24:52.480930 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.498529 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.529420 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.542830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.542865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.542876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.542892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.542905 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.558499 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.585933 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.606199 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.624481 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.645878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.645926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.645941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.645961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.645976 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.649096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.662874 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.681956 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.699821 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.717658 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.734925 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.746870 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.747853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.748039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.748220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.748382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.748551 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.759655 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.770414 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.780106 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.791973 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:52 crc 
kubenswrapper[4773]: I1012 20:24:52.850946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.851006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.851024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.851047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.851067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.954321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.954988 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.955014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.955035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:52 crc kubenswrapper[4773]: I1012 20:24:52.955050 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:52Z","lastTransitionTime":"2025-10-12T20:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.061310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.061350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.061368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.061385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.061396 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.164088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.164180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.164199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.164229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.164269 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.267138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.267192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.267209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.267232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.267248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.370237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.370292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.370308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.370334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.370351 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.473603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.473685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.473707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.473827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.473855 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.480388 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:53 crc kubenswrapper[4773]: E1012 20:24:53.480546 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.480834 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:53 crc kubenswrapper[4773]: E1012 20:24:53.481125 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.576408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.576476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.576494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.576520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.576607 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.679577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.679625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.679641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.679662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.679679 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.782529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.782984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.783148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.783299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.783443 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.886807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.886874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.886898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.886927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.886949 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.990589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.990659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.990684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.990751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:53 crc kubenswrapper[4773]: I1012 20:24:53.990778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:53Z","lastTransitionTime":"2025-10-12T20:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.094090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.094228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.094258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.094289 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.094313 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.197662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.197712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.197771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.197795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.197814 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.300425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.300566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.300579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.300594 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.300605 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.404331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.404391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.404408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.404431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.404448 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.480770 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:54 crc kubenswrapper[4773]: E1012 20:24:54.480896 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.480783 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:54 crc kubenswrapper[4773]: E1012 20:24:54.480971 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.506759 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.506790 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.506799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.506814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.506822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.609189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.609218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.609226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.609241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.609249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.712121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.712180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.712192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.712210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.712223 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.814495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.814559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.814581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.814611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.814636 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.917454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.917501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.917512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.917524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:54 crc kubenswrapper[4773]: I1012 20:24:54.917532 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:54Z","lastTransitionTime":"2025-10-12T20:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.019976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.020010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.020019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.020032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.020041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.046689 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.047330 4773 scope.go:117] "RemoveContainer" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" Oct 12 20:24:55 crc kubenswrapper[4773]: E1012 20:24:55.047457 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.122904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.122946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.122957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.122976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.122988 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.225961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.226027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.226040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.226055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.226065 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.328344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.328444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.328462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.328933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.328992 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.432310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.432350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.432361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.432378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.432389 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.480937 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.480975 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:55 crc kubenswrapper[4773]: E1012 20:24:55.481046 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:55 crc kubenswrapper[4773]: E1012 20:24:55.481382 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.535292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.535335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.535352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.535373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.535391 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.550585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:55 crc kubenswrapper[4773]: E1012 20:24:55.550893 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:55 crc kubenswrapper[4773]: E1012 20:24:55.550979 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:03.550956808 +0000 UTC m=+51.787255408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.637653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.637761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.637790 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.637820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.637843 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.740663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.740746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.740766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.740787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.740803 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.844548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.844598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.844618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.844643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.844661 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.947745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.947779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.947788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.947801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:55 crc kubenswrapper[4773]: I1012 20:24:55.947810 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:55Z","lastTransitionTime":"2025-10-12T20:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.051603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.051950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.052005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.052029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.052329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.155429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.155509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.155528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.155553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.155571 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.258326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.258387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.258406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.258431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.258447 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.361072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.361119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.361129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.361148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.361161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.464454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.464538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.464555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.464579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.464597 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.480996 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.481081 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:56 crc kubenswrapper[4773]: E1012 20:24:56.481289 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:56 crc kubenswrapper[4773]: E1012 20:24:56.481590 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.566654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.566690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.566698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.566712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.566735 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.669469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.669509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.669520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.669538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.669551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.771923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.771961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.771972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.771987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.771997 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.876036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.876088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.876099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.876116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.876127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.978675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.978746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.978758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.978771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:56 crc kubenswrapper[4773]: I1012 20:24:56.978780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:56Z","lastTransitionTime":"2025-10-12T20:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.082032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.082080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.082092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.082114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.082126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.184868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.184910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.184922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.184939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.184953 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.287024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.287054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.287062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.287075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.287087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.389802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.389838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.389847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.389862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.389872 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.480564 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.480909 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:57 crc kubenswrapper[4773]: E1012 20:24:57.481036 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:57 crc kubenswrapper[4773]: E1012 20:24:57.481202 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.492148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.492219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.492229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.492243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.492254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.594053 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.594110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.594128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.594152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.594169 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.696896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.696944 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.696961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.696984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.697002 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.799708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.799817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.799841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.799896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.799918 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.902995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.903047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.903073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.903096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:57 crc kubenswrapper[4773]: I1012 20:24:57.903114 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:57Z","lastTransitionTime":"2025-10-12T20:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.005932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.005981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.005991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.006006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.006017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.108613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.108684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.109032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.109070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.109096 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.211419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.211449 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.211457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.211469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.211477 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.314327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.314356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.314366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.314378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.314386 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.417472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.417530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.417550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.417574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.417592 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.480345 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:24:58 crc kubenswrapper[4773]: E1012 20:24:58.480482 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.480656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:24:58 crc kubenswrapper[4773]: E1012 20:24:58.480701 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.521203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.521270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.521288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.521316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.521335 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.625110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.625174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.625191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.625219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.625238 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.727556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.727599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.727609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.727623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.727634 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.830308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.830345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.830352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.830366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.830375 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.934501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.934550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.934567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.934589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:58 crc kubenswrapper[4773]: I1012 20:24:58.934606 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:58Z","lastTransitionTime":"2025-10-12T20:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.038541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.038599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.038619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.038648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.038671 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.141902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.142643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.142672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.142696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.142743 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.181495 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.193396 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.205990 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.229414 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.247164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.247268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.247294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.247366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.247389 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.251407 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.271641 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.294521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.310764 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc 
kubenswrapper[4773]: I1012 20:24:59.329760 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc 
kubenswrapper[4773]: I1012 20:24:59.351650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.351701 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.351752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.351777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.351792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.356309 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.388996 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.411353 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.429177 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.448554 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.453610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.453656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.453673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.453694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.453709 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.470226 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.481095 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.481232 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.481115 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.481810 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.490463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.508923 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.524457 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.526613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.526657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.526670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.526692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.526704 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.543488 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.547855 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.553272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.553338 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.553363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.553394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.553416 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.575383 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.581147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.581191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.581204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.581223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.581236 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.605858 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.611345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.611432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.611451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.611504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.611523 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.633947 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.638292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.638330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.638340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.638381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.638393 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.658660 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:24:59Z is after 2025-08-24T17:21:41Z" Oct 12 20:24:59 crc kubenswrapper[4773]: E1012 20:24:59.658928 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.660972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.661018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.661037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.661063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.661080 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.763766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.763810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.763823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.763840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.763853 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.866407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.866444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.866455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.866470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.866482 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.969318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.969373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.969394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.969424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:24:59 crc kubenswrapper[4773]: I1012 20:24:59.969456 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:24:59Z","lastTransitionTime":"2025-10-12T20:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.072582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.072632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.072646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.072670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.072683 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.177868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.177922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.177935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.177953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.178881 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.281456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.281554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.281569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.281584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.281615 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.385172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.385207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.385239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.385257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.385270 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.480906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:00 crc kubenswrapper[4773]: E1012 20:25:00.481087 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.481681 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:00 crc kubenswrapper[4773]: E1012 20:25:00.481827 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.487453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.487531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.487555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.487587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.487610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.590808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.590865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.590877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.590892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.591234 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.693817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.693854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.693864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.693879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.693888 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.797182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.797243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.797266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.797298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.797321 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.900746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.900807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.900823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.900845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:00 crc kubenswrapper[4773]: I1012 20:25:00.900862 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:00Z","lastTransitionTime":"2025-10-12T20:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.003361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.003425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.003441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.003465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.003483 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.106775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.106872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.106898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.106926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.106947 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.210812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.210936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.210971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.211046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.211069 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.314118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.314172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.314190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.314213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.314230 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.417928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.417990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.418000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.418033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.418046 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.480623 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.480623 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:01 crc kubenswrapper[4773]: E1012 20:25:01.480876 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:01 crc kubenswrapper[4773]: E1012 20:25:01.480969 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.520465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.520683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.520834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.520937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.521026 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.624181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.624226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.624236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.624254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.624265 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.727552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.727902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.728157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.728355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.728537 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.831624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.831681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.831698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.831750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.831769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.934347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.934428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.934452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.934500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:01 crc kubenswrapper[4773]: I1012 20:25:01.934524 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:01Z","lastTransitionTime":"2025-10-12T20:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.037550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.038079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.038652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.039035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.039380 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.143360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.143414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.143430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.143456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.143474 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.245563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.245622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.245638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.245660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.245676 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.348807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.348882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.348906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.348935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.348954 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.450749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.451090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.451258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.451445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.454336 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.480239 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.480296 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:02 crc kubenswrapper[4773]: E1012 20:25:02.480419 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:02 crc kubenswrapper[4773]: E1012 20:25:02.480538 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.500838 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.520106 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.538429 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.554260 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.559070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.559101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.559111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc 
kubenswrapper[4773]: I1012 20:25:02.559125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.559137 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.572597 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12
T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.587814 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc 
kubenswrapper[4773]: I1012 20:25:02.607175 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc 
kubenswrapper[4773]: I1012 20:25:02.627413 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.649800 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.661153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.661211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.661221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.661236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.661246 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.664610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.678709 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.694947 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.716051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.728255 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10
-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.740023 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.749968 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.761779 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.763216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.763434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.764207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.764323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.764481 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.772622 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:02Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.867477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.867562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.867612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.867639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.867657 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.971451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.971734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.971824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.971905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:02 crc kubenswrapper[4773]: I1012 20:25:02.971987 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:02Z","lastTransitionTime":"2025-10-12T20:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.074584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.074640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.074656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.074681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.074701 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.177228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.177292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.177316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.177344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.177366 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.280675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.280764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.280788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.280816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.280832 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.383373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.383696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.383946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.384088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.384251 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.480544 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.480544 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:03 crc kubenswrapper[4773]: E1012 20:25:03.480930 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
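The repeating `setters.go:603` entries above embed the node's full Ready condition as inline JSON after `condition=`. A minimal sketch of pulling the machine-readable fields out of such a line (the sample line below is abridged from the log above; the parsing approach is an illustration, not a kubelet-provided tool):

```python
import json
import re

# Abridged "Node became not ready" entry, as logged by setters.go:603 above.
line = ('I1012 20:25:02.971987 4773 setters.go:603] "Node became not ready" '
        'node="crc" condition={"type":"Ready","status":"False",'
        '"lastHeartbeatTime":"2025-10-12T20:25:02Z",'
        '"lastTransitionTime":"2025-10-12T20:25:02Z",'
        '"reason":"KubeletNotReady",'
        '"message":"container runtime network not ready: NetworkReady=false"}')

# The condition payload is the JSON object following "condition=".
match = re.search(r'condition=(\{.*\})', line)
condition = json.loads(match.group(1))

print(condition["reason"])  # KubeletNotReady
print(condition["status"])  # False
```

Once parsed, the `reason` and `lastTransitionTime` fields make it easy to see that every entry in this stretch reports the same `KubeletNotReady` cause: the missing CNI configuration in /etc/kubernetes/cni/net.d/.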
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:03 crc kubenswrapper[4773]: E1012 20:25:03.480757 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.488478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.488766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.489930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.490132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.490282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.593573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.593644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.593660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.593685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.593702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.641621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:03 crc kubenswrapper[4773]: E1012 20:25:03.641949 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:03 crc kubenswrapper[4773]: E1012 20:25:03.642090 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:19.642060422 +0000 UTC m=+67.878359022 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.696573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.696629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.696646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.696671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.696688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.799055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.799123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.799150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.799176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.799196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.902085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.902168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.902193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.902224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:03 crc kubenswrapper[4773]: I1012 20:25:03.902250 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:03Z","lastTransitionTime":"2025-10-12T20:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.005137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.005199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.005215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.005245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.005263 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.108385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.108441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.108459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.108481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.108502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.147875 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.148020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.148103 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:25:36.148077951 +0000 UTC m=+84.384376551 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.148157 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.148242 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:36.148215975 +0000 UTC m=+84.384514575 (durationBeforeRetry 32s). 
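The volume manager's retry delays in these entries double from 16s (the metrics-certs mount at 20:25:03) to 32s (the mounts and the CSI teardown at 20:25:04), consistent with a capped exponential backoff. A sketch of that schedule follows; the initial delay and cap used here are illustrative assumptions drawn from the observed values, not kubelet's exact constants:

```python
def backoff_schedule(initial_s: float, cap_s: float, retries: int) -> list[float]:
    """Doubling backoff with a ceiling: the pattern the durationBeforeRetry
    values above (16s, then 32s) appear to follow."""
    delays = []
    delay = initial_s
    for _ in range(retries):
        delays.append(delay)
        delay = min(delay * 2, cap_s)  # double until the cap is reached
    return delays

print(backoff_schedule(16, 128, 5))  # [16, 32, 64, 128, 128]
```

The practical consequence is visible in the log: once an operation fails, "No retries permitted until" pushes the next attempt progressively further out, so a transient cause (here, objects not yet registered) can keep a mount pending well after the underlying condition clears.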
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.211472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.211544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.211563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.211589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.211613 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.249404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.249489 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.249593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249602 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249765 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:36.249697782 +0000 UTC m=+84.485996382 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249794 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249852 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249881 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249899 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249909 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.249920 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.250005 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:36.24997591 +0000 UTC m=+84.486274500 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.250033 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:36.250021571 +0000 UTC m=+84.486320171 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.315191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.315245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.315262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.315333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.315354 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
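Each `nestedpendingoperations` failure above records both a wall-clock retry deadline and a monotonic offset (`m=+...`) relative to kubelet start. A small sketch of extracting those fields with a regex (the sample message is abridged from the log above, with the pod UID elided):

```python
import re

# Abridged nestedpendingoperations error message from the log above.
msg = ('Operation for "{volumeName:kubernetes.io/projected/...-kube-api-access-cqllr}" '
       'failed. No retries permitted until 2025-10-12 20:25:36.250021571 '
       '+0000 UTC m=+84.486320171 (durationBeforeRetry 32s).')

# Capture the retry deadline, the monotonic offset, and the backoff duration.
m = re.search(
    r'No retries permitted until (?P<until>[\d-]+ [\d:.]+) \+0000 UTC '
    r'm=\+(?P<mono>[\d.]+) \(durationBeforeRetry (?P<delay>\w+)\)',
    msg)

print(m.group('until'))  # 2025-10-12 20:25:36.250021571
print(m.group('delay'))  # 32s
```

Correlating the `m=+` offsets is often simpler than comparing timestamps: here `m=+84.48` minus the 32s backoff places the failure at roughly `m=+52`, matching the 20:25:04 entries.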
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.418637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.418692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.418709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.418770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.418793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.481037 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.481270 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.481050 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:04 crc kubenswrapper[4773]: E1012 20:25:04.481806 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.522419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.522483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.522503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.522526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.522547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.625475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.625537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.625553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.625577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.625594 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.728280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.728354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.728376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.728406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.728427 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.830948 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.831039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.831059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.831084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.831101 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.934024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.934065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.934081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.934102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:04 crc kubenswrapper[4773]: I1012 20:25:04.934118 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:04Z","lastTransitionTime":"2025-10-12T20:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.037890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.037938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.037955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.037980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.037999 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.140948 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.141042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.141061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.141084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.141139 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.247564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.247634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.247652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.247676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.247693 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.350321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.350391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.350410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.350437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.350456 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.452827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.452888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.452906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.452930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.452948 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.480483 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.480549 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:05 crc kubenswrapper[4773]: E1012 20:25:05.480607 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:05 crc kubenswrapper[4773]: E1012 20:25:05.480675 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.555792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.555848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.555863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.555883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.555897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.659925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.659975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.659994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.660017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.660034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.762699 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.762786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.762809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.762836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.762858 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.866092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.866154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.866171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.866197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.866222 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.969757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.969829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.969849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.969877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:05 crc kubenswrapper[4773]: I1012 20:25:05.969893 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:05Z","lastTransitionTime":"2025-10-12T20:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.072520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.072576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.072592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.072618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.072635 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.175112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.175147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.175154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.175170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.175180 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.277767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.277797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.277805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.277818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.277828 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.380334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.380369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.380379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.380394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.380402 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.480364 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:06 crc kubenswrapper[4773]: E1012 20:25:06.480563 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.480675 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:06 crc kubenswrapper[4773]: E1012 20:25:06.481438 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.481770 4773 scope.go:117] "RemoveContainer" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.483826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.483868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.483881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.483899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.483912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.586421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.586793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.586810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.586828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.587174 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.690261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.690329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.690354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.690386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.690409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.793452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.793517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.793544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.794210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.794259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.799280 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/1.log" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.802038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.802331 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.819112 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.832977 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.847496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.864186 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc 
kubenswrapper[4773]: I1012 20:25:06.882236 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.896961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.896993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.897001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.897015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.897023 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.906414 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.923835 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.943768 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 
20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.963383 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.985662 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:06Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.998967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.999007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.999015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.999031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:06 crc kubenswrapper[4773]: I1012 20:25:06.999040 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:06Z","lastTransitionTime":"2025-10-12T20:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.011402 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.036041 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.052018 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.065337 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.074755 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.083066 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.093544 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.101227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.101266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.101273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.101288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.101296 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.109099 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd
90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.203840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.203872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.203880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.203892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.203902 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.306356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.306388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.306396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.306410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.306419 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.408174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.408208 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.408216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.408231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.408240 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.480477 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.480506 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:07 crc kubenswrapper[4773]: E1012 20:25:07.480613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:07 crc kubenswrapper[4773]: E1012 20:25:07.480988 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.510765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.510792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.510800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.510812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.510821 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.613244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.613300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.613309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.613324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.613333 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.715636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.715702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.715740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.715766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.715782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.806578 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/2.log" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.807169 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/1.log" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.810379 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" exitCode=1 Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.810407 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.810442 4773 scope.go:117] "RemoveContainer" containerID="b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.811689 4773 scope.go:117] "RemoveContainer" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" Oct 12 20:25:07 crc kubenswrapper[4773]: E1012 20:25:07.812009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.817785 4773 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.817811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.817821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.817835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.817846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.830461 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.850493 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.870930 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.888089 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.907791 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.920021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.920080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.920098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.920122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.920144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:07Z","lastTransitionTime":"2025-10-12T20:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.923326 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T
20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.935529 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.946180 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.957699 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.967357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.981783 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:07 crc kubenswrapper[4773]: I1012 20:25:07.995152 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:07Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.008889 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.021370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.022811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.022862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.022879 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.022903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.022919 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.036216 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 
12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.047291 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.074558 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b224e005ae3669e4e408ccfe9870e1507cd9b426e16f86a8771fad725a9b1d40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:24:45Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI1012 20:24:45.673169 6119 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI1012 20:24:45.673182 6119 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 20:24:45.673204 6119 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:24:45.673218 6119 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1012 20:24:45.673239 6119 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:24:45.673252 6119 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1012 20:24:45.673289 6119 factory.go:656] Stopping watch factory\\\\nI1012 20:24:45.673318 6119 handler.go:208] Removed *v1.Node event handler 7\\\\nI1012 20:24:45.673795 6119 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:24:45.673796 6119 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 20:24:45.673830 6119 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 20:24:45.673850 6119 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:24:45.673863 6119 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 20:24:45.673857 6119 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 20:24:45.673851 6119 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 20:24:45.673901 6119 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:24:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 
20:25:07.421683 6361 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.094930 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.125593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc 
kubenswrapper[4773]: I1012 20:25:08.125660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.125674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.125734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.125748 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.228707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.228779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.228795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.228815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.228831 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.331466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.331508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.331526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.331550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.331566 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.433759 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.433794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.433804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.433820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.433831 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.480758 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:08 crc kubenswrapper[4773]: E1012 20:25:08.480962 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.481320 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:08 crc kubenswrapper[4773]: E1012 20:25:08.481553 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.536559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.536617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.536640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.536665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.536683 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.640036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.640093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.640110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.640132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.640149 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.743539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.743601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.743618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.743645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.743668 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.817439 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/2.log" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.823399 4773 scope.go:117] "RemoveContainer" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" Oct 12 20:25:08 crc kubenswrapper[4773]: E1012 20:25:08.823694 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.843153 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.846110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.846156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.846173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 
20:25:08.846195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.846211 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.864076 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.883199 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.902119 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.919307 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.936329 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc 
kubenswrapper[4773]: I1012 20:25:08.948593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.948649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.948666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.948690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.948707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:08Z","lastTransitionTime":"2025-10-12T20:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.960222 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z 
is after 2025-08-24T17:21:41Z" Oct 12 20:25:08 crc kubenswrapper[4773]: I1012 20:25:08.993092 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:08Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.019711 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.044436 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.052038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.052115 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.052134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.052183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.052202 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.066794 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.105536 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.127982 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.148293 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-r
esources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.154344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.154543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc 
kubenswrapper[4773]: I1012 20:25:09.154669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.154811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.154903 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.158940 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.169448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.179416 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.189829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.257310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.257588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.257737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.257854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.258005 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.365302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.365351 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.365367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.365392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.365411 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.468926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.469398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.469464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.469555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.469645 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.480390 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.480484 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.480534 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.480684 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.572973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.573202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.573291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.573368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.573455 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.676258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.676737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.676811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.676874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.676956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.780089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.780160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.780186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.780218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.780242 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.835554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.835622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.835640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.835671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.835695 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.857615 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.863298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.863356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.863373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.863403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.863420 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.884013 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.888599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.888644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.888662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.888685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.888702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.908705 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.913375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.913420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.913436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.913460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.913477 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.930359 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.934710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.934795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.934813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.934839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.934858 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.955185 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:09Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:09 crc kubenswrapper[4773]: E1012 20:25:09.955432 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.957609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.957641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.957652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.957668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:09 crc kubenswrapper[4773]: I1012 20:25:09.957695 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:09Z","lastTransitionTime":"2025-10-12T20:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.060320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.060677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.060877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.061029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.061181 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.163203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.163259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.163275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.163301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.163318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.266588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.266641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.266656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.266678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.266694 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.369645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.369699 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.369792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.369823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.369842 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.473209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.473254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.473265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.473283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.473295 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.480555 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.480600 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:10 crc kubenswrapper[4773]: E1012 20:25:10.480746 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:10 crc kubenswrapper[4773]: E1012 20:25:10.480845 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.576408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.576472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.576493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.576517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.576534 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.680054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.680141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.680165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.680199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.680224 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.783642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.783706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.783758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.783785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.783806 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.886920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.887033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.887050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.887073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.887090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.990014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.990063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.990079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.990138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:10 crc kubenswrapper[4773]: I1012 20:25:10.990156 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:10Z","lastTransitionTime":"2025-10-12T20:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.093033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.093085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.093102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.093128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.093147 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.196938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.196991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.197008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.197032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.197048 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.300033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.300083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.300094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.300114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.300128 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.403891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.403971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.403994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.404025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.404052 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.481101 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.481119 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:11 crc kubenswrapper[4773]: E1012 20:25:11.481448 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:11 crc kubenswrapper[4773]: E1012 20:25:11.481292 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.507448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.507513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.507529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.507553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.507572 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.611063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.611181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.611203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.611227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.611248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.714589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.714650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.714667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.714693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.714739 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.817891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.817952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.817974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.818000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.818017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.921211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.921278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.921300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.921328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:11 crc kubenswrapper[4773]: I1012 20:25:11.921350 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:11Z","lastTransitionTime":"2025-10-12T20:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.024372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.024432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.024448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.024473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.024490 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.127662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.127751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.127775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.127803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.127824 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.230885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.230961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.230985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.231010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.231028 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.333884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.333921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.333930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.333945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.333957 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.435222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.435258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.435267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.435287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.435297 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.480065 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.480227 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:12 crc kubenswrapper[4773]: E1012 20:25:12.480452 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:12 crc kubenswrapper[4773]: E1012 20:25:12.480965 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.506781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\
"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.524653 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.539748 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.540124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.540195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.540229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.540248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.540260 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.557444 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.570539 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.592357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.609156 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.623577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.633782 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.643230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.643414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.643525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.643617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.643697 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.645682 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.657422 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.675217 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.691917 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.705395 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.717487 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.727761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc 
kubenswrapper[4773]: I1012 20:25:12.746875 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:12 crc 
kubenswrapper[4773]: I1012 20:25:12.747067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.747091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.747101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.747133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.747143 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.774373 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:12Z is after 2025-08-24T17:21:41Z"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.850920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.850982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.851003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.851033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.851057 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.953352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.953396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.953411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.953432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:12 crc kubenswrapper[4773]: I1012 20:25:12.953450 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:12Z","lastTransitionTime":"2025-10-12T20:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.056414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.056451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.056459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.056476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.056487 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.158832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.159065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.159156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.159273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.159348 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.261253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.261308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.261325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.261348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.261365 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.363629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.363662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.363670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.363683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.363691 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.465640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.465685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.465695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.465710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.465735 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.480563 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.480641 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 20:25:13 crc kubenswrapper[4773]: E1012 20:25:13.480698 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364"
Oct 12 20:25:13 crc kubenswrapper[4773]: E1012 20:25:13.480863 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.568135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.568200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.568218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.568243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.568259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.671010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.671068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.671086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.671110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.671127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.773831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.773892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.773909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.773933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.773951 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.876567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.876617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.876630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.876647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:13 crc kubenswrapper[4773]: I1012 20:25:13.876660 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:13Z","lastTransitionTime":"2025-10-12T20:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.024595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.024647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.024660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.024678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.024691 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.127495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.127542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.127560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.127582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.127598 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.232271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.232324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.232771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.232799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.232862 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.336371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.336419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.336434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.336451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.336463 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.438418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.438459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.438470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.438487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.438498 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.480261 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 20:25:14 crc kubenswrapper[4773]: E1012 20:25:14.480372 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.480254 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 20:25:14 crc kubenswrapper[4773]: E1012 20:25:14.480503 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.540996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.541033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.541042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.541055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.541064 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.643971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.644028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.644048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.644073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.644092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.746897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.746942 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.746952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.746969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.746979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.849416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.849464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.849473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.849488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.849496 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.951989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.952045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.952062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.952085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:14 crc kubenswrapper[4773]: I1012 20:25:14.952103 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:14Z","lastTransitionTime":"2025-10-12T20:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.054664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.054739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.054748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.054764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.054777 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.158048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.158104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.158126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.158154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.158176 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.261903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.262007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.262027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.262052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.262072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.364798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.364854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.364866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.364884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.365316 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.468251 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.468290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.468301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.468320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.468333 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.480378 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.480412 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 20:25:15 crc kubenswrapper[4773]: E1012 20:25:15.480583 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364"
Oct 12 20:25:15 crc kubenswrapper[4773]: E1012 20:25:15.480766 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.571501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.571552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.571565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.571583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.571597 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.676642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.676696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.676740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.676768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.676784 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.779491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.779542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.779553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.779569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.779583 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.881958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.882001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.882011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.882029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.882039 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.984447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.984489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.984499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.984516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:15 crc kubenswrapper[4773]: I1012 20:25:15.984528 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:15Z","lastTransitionTime":"2025-10-12T20:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.087133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.087210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.087230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.087254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.087271 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.194656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.194742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.194767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.194798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.194822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.297294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.297333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.297350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.297375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.297391 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.399429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.399483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.399502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.399525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.399542 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.481132 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:16 crc kubenswrapper[4773]: E1012 20:25:16.481331 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.481148 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:16 crc kubenswrapper[4773]: E1012 20:25:16.481529 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.501769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.501809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.501819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.501834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.501845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.605309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.605345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.605352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.605366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.605374 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.707294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.707330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.707341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.707356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.707367 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.809579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.809610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.809619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.809633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.809642 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.912515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.912574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.912593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.912616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:16 crc kubenswrapper[4773]: I1012 20:25:16.912632 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:16Z","lastTransitionTime":"2025-10-12T20:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.015335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.015399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.015416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.015442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.015459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.117212 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.117276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.117295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.117319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.117335 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.219796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.219832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.219840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.219854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.219864 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.322010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.322056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.322068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.322087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.322100 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.424339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.424613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.424690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.424786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.424863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.480701 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.480724 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:17 crc kubenswrapper[4773]: E1012 20:25:17.481080 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:17 crc kubenswrapper[4773]: E1012 20:25:17.481243 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.527288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.527323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.527333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.527351 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.527362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.629791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.629834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.629847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.629864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.629885 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.731904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.731941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.731956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.731993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.732005 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.834536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.834563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.834571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.834583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.834594 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.936668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.936699 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.936730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.936744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:17 crc kubenswrapper[4773]: I1012 20:25:17.936753 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:17Z","lastTransitionTime":"2025-10-12T20:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.039194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.039245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.039254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.039267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.039279 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.141470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.141520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.141529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.141543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.141552 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.243221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.243291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.243304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.243322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.243333 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.344887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.344920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.344931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.344945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.344957 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.446506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.446544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.446554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.446569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.446579 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.480120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:18 crc kubenswrapper[4773]: E1012 20:25:18.480210 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.480121 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:18 crc kubenswrapper[4773]: E1012 20:25:18.480271 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.548380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.548416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.548428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.548442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.548452 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.649867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.649897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.649906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.649919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.649928 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.752972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.753012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.753020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.753035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.753045 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.854813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.854851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.854863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.854877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.854887 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.957275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.957311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.957320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.957335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:18 crc kubenswrapper[4773]: I1012 20:25:18.957344 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:18Z","lastTransitionTime":"2025-10-12T20:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.059480 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.059526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.059536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.059553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.059565 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.161521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.161583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.161592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.161609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.161620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.264134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.264167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.264174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.264187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.264196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.366037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.366072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.366081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.366094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.366105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.468178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.468215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.468224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.468237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.468248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.480880 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.480896 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:19 crc kubenswrapper[4773]: E1012 20:25:19.480980 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:19 crc kubenswrapper[4773]: E1012 20:25:19.481049 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.570457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.570505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.570517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.570536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.570550 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.672668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.672710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.672748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.672767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.672780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.714542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:19 crc kubenswrapper[4773]: E1012 20:25:19.714685 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:19 crc kubenswrapper[4773]: E1012 20:25:19.714759 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:25:51.714742946 +0000 UTC m=+99.951041506 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.774628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.774664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.774672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.774687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.774696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.876455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.876485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.876493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.876506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.876515 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.978405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.978439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.978449 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.978463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:19 crc kubenswrapper[4773]: I1012 20:25:19.978472 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:19Z","lastTransitionTime":"2025-10-12T20:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.080781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.080812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.080821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.080837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.080846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.183684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.183757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.183769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.183788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.183800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.286103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.286179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.286196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.286218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.286260 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.308676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.308709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.308734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.308749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.308769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.320511 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:20Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.324049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.324104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.324116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.324153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.324166 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.339921 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:20Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.342347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.342375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.342384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.342399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.342409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.356342 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:20Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.359570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.359627 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.359639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.359653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.359663 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.370550 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:20Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.373744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.373777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.373794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.373844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.373855 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.387180 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:20Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.387309 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.388662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.388696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.388707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.388743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.388753 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.480442 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.480604 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.480836 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:20 crc kubenswrapper[4773]: E1012 20:25:20.480894 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.491010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.491040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.491049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.491063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.491072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.593004 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.593055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.593064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.593077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.593086 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.695821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.695865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.695876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.695894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.695906 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.797707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.797760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.797769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.797783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.797792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.899834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.900004 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.900078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.900145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:20 crc kubenswrapper[4773]: I1012 20:25:20.900209 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:20Z","lastTransitionTime":"2025-10-12T20:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.002820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.003086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.003147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.003211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.003273 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.105259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.105291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.105299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.105312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.105322 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.207411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.207452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.207464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.207499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.207511 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.310080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.310117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.310128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.310142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.310153 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.412229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.412264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.412276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.412292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.412303 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.481805 4773 scope.go:117] "RemoveContainer" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" Oct 12 20:25:21 crc kubenswrapper[4773]: E1012 20:25:21.482056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.482242 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:21 crc kubenswrapper[4773]: E1012 20:25:21.482315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.482448 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:21 crc kubenswrapper[4773]: E1012 20:25:21.482519 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.515139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.515186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.515203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.515225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.515242 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.617993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.618038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.618049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.618067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.618080 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.720351 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.720384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.720392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.720406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.720418 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.822995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.823025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.823035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.823047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.823056 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.863828 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/0.log" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.863929 4773 generic.go:334] "Generic (PLEG): container finished" podID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" containerID="249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a" exitCode=1 Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.863980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerDied","Data":"249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.864509 4773 scope.go:117] "RemoveContainer" containerID="249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.877233 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.897808 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.915848 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.924931 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.924960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.924967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.924981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.924990 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:21Z","lastTransitionTime":"2025-10-12T20:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.932406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.946181 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.961477 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc 
kubenswrapper[4773]: I1012 20:25:21.975002 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:21 crc kubenswrapper[4773]: I1012 20:25:21.990280 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:21Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.004231 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.017555 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.027665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.027735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.027753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc 
kubenswrapper[4773]: I1012 20:25:22.027809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.027827 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.033759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12
T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.050436 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.074609 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.095742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.111637 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.127830 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.129751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.129783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.129792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.129808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.129816 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.140263 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.156130 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2b
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.232147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.232198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.232210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.232226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.232235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.334046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.334091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.334100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.334113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.334127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.436546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.436580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.436589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.436603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.436614 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.480363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:22 crc kubenswrapper[4773]: E1012 20:25:22.480477 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.480533 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:22 crc kubenswrapper[4773]: E1012 20:25:22.480670 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.498003 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.517630 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.529017 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.539472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.539522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.539535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.539554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.539567 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.541495 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.553978 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2b
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.585201 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.600407 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.612457 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.624331 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.632992 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.641557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.641601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.641609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.641627 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.641637 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.642282 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.652008 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.662189 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.672427 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.681290 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.691961 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc 
kubenswrapper[4773]: I1012 20:25:22.705087 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.717401 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.743741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.743914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.744027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.744148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.744258 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.847620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.847652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.847660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.847672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.847681 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.869354 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/0.log" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.869435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerStarted","Data":"4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.887154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.904761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.917110 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.930930 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.941235 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.954684 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc 
kubenswrapper[4773]: I1012 20:25:22.955027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.955070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.955079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.955094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.955103 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:22Z","lastTransitionTime":"2025-10-12T20:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.967788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:22 crc kubenswrapper[4773]: I1012 20:25:22.994105 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:22Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.010335 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.029577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.041610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.053278 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.057205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.057244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.057255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.057272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.057283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.071113 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.081494 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.097504 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.108904 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.117421 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.133447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:23Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.160146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.160207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.160221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.160240 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.160254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.262281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.262444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.262529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.262622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.262731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.364582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.364617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.364626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.364638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.364649 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.467080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.467118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.467130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.467147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.467158 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.480471 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.480508 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:23 crc kubenswrapper[4773]: E1012 20:25:23.480562 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:23 crc kubenswrapper[4773]: E1012 20:25:23.480706 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.569555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.569599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.569616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.569639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.569656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.672021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.672169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.672190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.672215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.672234 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.774613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.774653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.774664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.774678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.774689 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.877659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.877697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.877735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.877757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.877774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.979762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.979791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.979801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.979818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:23 crc kubenswrapper[4773]: I1012 20:25:23.979829 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:23Z","lastTransitionTime":"2025-10-12T20:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.081774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.081820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.081831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.081848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.081860 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.184246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.184277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.184287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.184301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.184309 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.286656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.286704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.286740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.286774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.286787 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.389600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.389632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.389642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.389656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.389664 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.481923 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:24 crc kubenswrapper[4773]: E1012 20:25:24.482043 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.481938 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:24 crc kubenswrapper[4773]: E1012 20:25:24.482223 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.491829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.491871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.491886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.491903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.491915 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.593902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.593926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.593934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.593946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.593954 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.695702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.695954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.696059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.696143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.696217 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.798767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.798800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.798807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.798821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.798829 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.901053 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.901093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.901103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.901119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:24 crc kubenswrapper[4773]: I1012 20:25:24.901131 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:24Z","lastTransitionTime":"2025-10-12T20:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.003173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.003205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.003213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.003226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.003233 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.105021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.105054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.105063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.105077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.105086 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.207945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.207986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.207996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.208011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.208022 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.310702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.310766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.310777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.310796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.310811 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.413230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.413269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.413278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.413292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.413303 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.480129 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.480185 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:25 crc kubenswrapper[4773]: E1012 20:25:25.480238 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:25 crc kubenswrapper[4773]: E1012 20:25:25.480355 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.516485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.516525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.516536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.516550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.516569 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.619338 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.619379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.619387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.619403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.619412 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.721821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.721878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.721898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.721923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.721941 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.824789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.824817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.824824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.824838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.824850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.927359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.927393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.927401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.927413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:25 crc kubenswrapper[4773]: I1012 20:25:25.927425 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:25Z","lastTransitionTime":"2025-10-12T20:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.030526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.030625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.030647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.030675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.030695 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.133324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.133350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.133357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.133383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.133392 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.235578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.235621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.235632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.235654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.235668 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.338228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.338262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.338270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.338290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.338300 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.440565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.440623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.440632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.440646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.440658 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.480357 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.480358 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:26 crc kubenswrapper[4773]: E1012 20:25:26.480779 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:26 crc kubenswrapper[4773]: E1012 20:25:26.480903 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.566011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.566397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.566590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.566879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.567089 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.670784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.671148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.671289 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.671412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.671593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.774999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.775057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.775075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.775098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.775116 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.877569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.877912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.878084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.878216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.878332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.981757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.981810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.981823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.981839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:26 crc kubenswrapper[4773]: I1012 20:25:26.981852 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:26Z","lastTransitionTime":"2025-10-12T20:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.085403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.085445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.085472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.085491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.085500 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.191617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.191669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.191698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.191756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.191774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.294577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.294647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.294665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.294690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.294707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.397809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.397866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.397882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.397905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.397923 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.480615 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:27 crc kubenswrapper[4773]: E1012 20:25:27.480817 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.481119 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:27 crc kubenswrapper[4773]: E1012 20:25:27.481208 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.500922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.500972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.500990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.501013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.501031 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.603564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.603635 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.603658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.603690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.603747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.706510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.706561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.706578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.706604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.706624 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.808705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.808749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.808757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.808771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.808781 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.911741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.911765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.911773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.911785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:27 crc kubenswrapper[4773]: I1012 20:25:27.911794 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:27Z","lastTransitionTime":"2025-10-12T20:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.015804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.015854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.015865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.015883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.015895 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.117807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.117834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.117842 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.117853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.117861 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.220216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.220295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.220319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.220348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.220369 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.323054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.323100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.323111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.323129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.323142 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.425610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.425671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.425688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.425764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.425792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.480239 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:28 crc kubenswrapper[4773]: E1012 20:25:28.480368 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.480463 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:28 crc kubenswrapper[4773]: E1012 20:25:28.480649 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.529481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.529567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.529584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.529604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.529620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.632632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.632703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.633030 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.633059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.633076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.735660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.735756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.735771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.735791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.735804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.838945 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.838997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.839012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.839029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.839042 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.941755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.941824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.941849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.941881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:28 crc kubenswrapper[4773]: I1012 20:25:28.941904 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:28Z","lastTransitionTime":"2025-10-12T20:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.045040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.045129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.045148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.045175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.045232 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.149986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.150129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.150153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.150182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.150201 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.253640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.253746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.253767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.253792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.253808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.359164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.359244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.359267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.359300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.359321 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.462465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.462529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.462545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.462568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.462608 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.480420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.480527 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:29 crc kubenswrapper[4773]: E1012 20:25:29.480581 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:29 crc kubenswrapper[4773]: E1012 20:25:29.480699 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.564952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.564999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.565016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.565040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.565057 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.668215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.668261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.668281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.668305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.668321 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.770956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.771022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.771043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.771074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.771095 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.874367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.874438 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.874458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.874490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.874558 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.977738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.978149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.978296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.978449 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:29 crc kubenswrapper[4773]: I1012 20:25:29.978575 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:29Z","lastTransitionTime":"2025-10-12T20:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.081493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.082469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.082617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.082785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.082943 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.185409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.185467 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.185485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.185510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.185527 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.289232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.289674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.289976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.290189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.290336 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.394006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.395135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.395309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.395802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.395961 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.480085 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.480250 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.480673 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.480918 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.498463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.498585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.498603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.498626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.498643 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.601560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.601604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.601623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.601648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.601665 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.693933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.693978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.693990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.694008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.694019 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.716023 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:30Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.722282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.722516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.722683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.722998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.723154 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.746330 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:30Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.753989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.754035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.754047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.754066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.754078 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.775663 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:30Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.780979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.781176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.781262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.781363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.781464 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.798932 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:30Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.803708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.803905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.803993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.804078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.804179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.821203 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:30Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:30 crc kubenswrapper[4773]: E1012 20:25:30.821440 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.824055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.824229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.824314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.824405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.824528 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.927619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.928108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.928228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.928313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:30 crc kubenswrapper[4773]: I1012 20:25:30.928389 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:30Z","lastTransitionTime":"2025-10-12T20:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.032059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.032456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.032608 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.032776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.032905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.136930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.136992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.137016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.137070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.137092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.239209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.239266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.239279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.239318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.239343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.341871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.342155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.342249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.342381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.342495 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.445402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.445475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.445491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.445512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.445526 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.480877 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:31 crc kubenswrapper[4773]: E1012 20:25:31.481009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.480885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:31 crc kubenswrapper[4773]: E1012 20:25:31.481138 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.548458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.548504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.548518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.548539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.548551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.656164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.656295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.656405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.656444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.656478 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.760293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.760691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.760916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.761138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.761326 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.865120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.865524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.865775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.866028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.866236 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.970158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.970225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.970250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.970280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:31 crc kubenswrapper[4773]: I1012 20:25:31.970301 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:31Z","lastTransitionTime":"2025-10-12T20:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.073892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.073953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.073969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.073990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.074006 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.177517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.177587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.177611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.177641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.177664 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.280864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.281215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.281457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.281749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.281984 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.384851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.385217 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.385660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.385835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.385957 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.481019 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:32 crc kubenswrapper[4773]: E1012 20:25:32.481122 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.481266 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:32 crc kubenswrapper[4773]: E1012 20:25:32.481311 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.487795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.487830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.487844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.487916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.487932 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.498997 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.515571 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.531394 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.544951 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.558994 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.570546 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc 
kubenswrapper[4773]: I1012 20:25:32.587340 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.592797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.592843 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.592857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.592876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.592890 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.620779 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.644452 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.660704 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.676980 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.690583 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.695582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.695640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.695660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.695692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.695713 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.741207 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.773768 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.791272 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.798638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.798681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.798690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.798708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.798738 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.806304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.817755 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.827965 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:32Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.901218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.901260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.901272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.901290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:32 crc kubenswrapper[4773]: I1012 20:25:32.901302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:32Z","lastTransitionTime":"2025-10-12T20:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.004172 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.004229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.004247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.004274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.004291 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.108124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.108546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.108686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.108866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.109171 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.212232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.212298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.212317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.212343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.212360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.315569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.315631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.315648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.315673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.315693 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.419574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.419650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.419676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.419705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.419800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.480864 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.480997 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:33 crc kubenswrapper[4773]: E1012 20:25:33.481150 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:33 crc kubenswrapper[4773]: E1012 20:25:33.481501 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.523903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.523973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.523991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.524019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.524040 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.626594 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.626645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.626656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.626674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.626688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.729900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.729955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.729973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.730000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.730019 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.832364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.832416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.832429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.832445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.832457 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.935639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.935699 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.935737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.935762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:33 crc kubenswrapper[4773]: I1012 20:25:33.935780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:33Z","lastTransitionTime":"2025-10-12T20:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.038622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.038669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.038682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.038701 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.038743 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.141838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.141885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.141894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.141911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.141922 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.244932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.245013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.245037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.245070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.245097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.346843 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.346878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.346888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.346904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.346914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.449846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.449874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.449883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.449898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.449909 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.480096 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.480156 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:34 crc kubenswrapper[4773]: E1012 20:25:34.480349 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:34 crc kubenswrapper[4773]: E1012 20:25:34.480498 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.552695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.552744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.552756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.552769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.552778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.655310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.655356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.655370 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.655390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.655404 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.759616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.759652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.759660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.759679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.759689 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.864174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.864581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.864949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.865152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.865315 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.967604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.967680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.967691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.967704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:34 crc kubenswrapper[4773]: I1012 20:25:34.967730 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:34Z","lastTransitionTime":"2025-10-12T20:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.070419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.070807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.071009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.071189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.071515 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.174801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.175187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.175332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.175474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.175603 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.279233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.279566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.279583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.279606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.279623 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.382074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.382131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.382152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.382184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.382206 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.480339 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.480379 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:35 crc kubenswrapper[4773]: E1012 20:25:35.481022 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:35 crc kubenswrapper[4773]: E1012 20:25:35.481194 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.484693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.484742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.484752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.484766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.484777 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.586972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.587032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.587050 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.587074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.587092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.690008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.690080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.690095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.690112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.690124 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.792014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.792044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.792072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.792086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.792094 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.894180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.894248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.894269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.894301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.894325 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.997277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.997323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.997337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.997354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:35 crc kubenswrapper[4773]: I1012 20:25:35.997367 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:35Z","lastTransitionTime":"2025-10-12T20:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.099508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.099579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.099599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.099624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.099642 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.187979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.188197 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:40.18815678 +0000 UTC m=+148.424455380 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.188431 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.188572 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.188649 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.188630893 +0000 UTC m=+148.424929503 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.201779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.201841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.201857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.201881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.201897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.290019 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.290573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.290316 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.290701 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.290665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.290764 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.290846 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.290820233 +0000 UTC m=+148.527118833 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.290878 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.291029 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.290995638 +0000 UTC m=+148.527294238 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.291125 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.291194 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.291249 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.291353 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.291338067 +0000 UTC m=+148.527636627 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.305042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.305080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.305093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.305109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.305121 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.407864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.407900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.407910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.407927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.407939 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.480503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.480614 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.482014 4773 scope.go:117] "RemoveContainer" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.482947 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:36 crc kubenswrapper[4773]: E1012 20:25:36.483171 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.511396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.511465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.511489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.511521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.511543 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.613938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.614068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.614132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.614203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.614263 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.717636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.717679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.717695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.717832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.717859 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.821328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.821417 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.821434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.821458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.821507 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.917004 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/2.log" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.919691 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.920812 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.930336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.930397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.930416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.930445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.930478 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:36Z","lastTransitionTime":"2025-10-12T20:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.939613 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:36 crc 
kubenswrapper[4773]: I1012 20:25:36.966733 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:36 crc kubenswrapper[4773]: I1012 20:25:36.983333 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.000505 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:36Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.011395 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.021008 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.031657 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] 
Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.032406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.032445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.032454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.032470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.032482 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.047162 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.065751 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.078445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.089373 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.101037 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.113905 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.124298 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.134688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.134743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.134752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.134768 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.134776 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.137161 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.147529 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.160309 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.169647 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.237110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.237312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.237378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.237445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.237526 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.340594 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.340953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.340963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.340979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.340992 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.443500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.443550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.443562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.443579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.443591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.480998 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.481087 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:37 crc kubenswrapper[4773]: E1012 20:25:37.481222 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:37 crc kubenswrapper[4773]: E1012 20:25:37.481366 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.491233 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.545824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.545870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.545882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.545900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.545912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.648316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.648583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.648646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.648734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.648811 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.751492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.751801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.751870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.751939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.752025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.854618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.854976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.855148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.855311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.855447 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.925208 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/3.log" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.926503 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/2.log" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.929411 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" exitCode=1 Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.929494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.929838 4773 scope.go:117] "RemoveContainer" containerID="e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.930468 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:25:37 crc kubenswrapper[4773]: E1012 20:25:37.930706 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.950011 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.957592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.957622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.957633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.957648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.957696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:37Z","lastTransitionTime":"2025-10-12T20:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.969251 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.984136 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:37 crc kubenswrapper[4773]: I1012 20:25:37.997262 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:37Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.010877 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.027009 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc 
kubenswrapper[4773]: I1012 20:25:38.046318 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.060103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.060133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.060140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.060153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.060162 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.072266 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e740f5c0f8e08f9cb4faa1c01791ab494d3a1741ab0c461bfbd3da7fe682efda\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:07Z\\\",\\\"message\\\":\\\"ient/pkg/client/informers/externalversions/factory.go:117\\\\nI1012 20:25:07.421506 6361 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.421683 6361 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 20:25:07.426845 6361 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:07.426879 6361 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:07.426926 6361 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:07.427132 6361 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:07.428227 6361 factory.go:656] Stopping watch factory\\\\nI1012 20:25:07.432175 6361 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:07.432230 6361 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:07.432329 6361 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:07.432378 6361 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:07.432472 6361 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269190 6752 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269271 6752 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269451 6752 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 
20:25:37.275855 6752 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:37.275907 6752 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:37.275983 6752 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:37.276063 6752 factory.go:656] Stopping watch factory\\\\nI1012 20:25:37.276098 6752 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:37.340812 6752 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:37.340842 6752 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:37.340897 6752 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:37.340920 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:37.341383 6752 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.102634 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e638
0e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.118875 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.135960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.153625 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.163222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.163258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.163266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.163279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.163288 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.171961 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.183082 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d1028c-04be-4fca-b4dc-d1af16989edb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54bb5a373141b25b2b8a2e3e3f1ee55b22d419210354ff015f9f188a44eb74be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc 
kubenswrapper[4773]: I1012 20:25:38.196520 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.211776 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.226766 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.241547 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.252840 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.265540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.265561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.265568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.265580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.265588 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.367997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.368033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.368043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.368057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.368068 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.471244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.471300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.471319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.471342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.471360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.480606 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.480625 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:38 crc kubenswrapper[4773]: E1012 20:25:38.480898 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:38 crc kubenswrapper[4773]: E1012 20:25:38.480934 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.573878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.573938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.573956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.573980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.573996 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.677218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.677279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.677298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.677326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.677346 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.780474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.780528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.780574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.780601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.780619 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.883661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.883708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.883755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.883776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.883791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.938137 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/3.log" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.945349 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:25:38 crc kubenswrapper[4773]: E1012 20:25:38.945605 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.962114 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.977903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.991158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.991213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.991230 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.991256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.991276 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:38Z","lastTransitionTime":"2025-10-12T20:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:38 crc kubenswrapper[4773]: I1012 20:25:38.993468 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:38Z is after 2025-08-24T17:21:41Z" Oct 
12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.013247 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.032286 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.051060 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.069661 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] 
Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.094372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.094461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.094506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.094529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.094546 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.103330 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269190 6752 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269271 6752 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269451 6752 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.275855 6752 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:37.275907 6752 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:37.275983 6752 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:37.276063 6752 factory.go:656] Stopping watch factory\\\\nI1012 20:25:37.276098 6752 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:37.340812 6752 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:37.340842 6752 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:37.340897 6752 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:37.340920 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:37.341383 6752 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.121392 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.138500 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.171251 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.195282 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.197946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.197997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.198014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.198037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.198055 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.218424 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35
fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.235638 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.249603 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.265300 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7
a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.290464 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d1028c-04be-4fca-b4dc-d1af16989edb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54bb5a373141b25b2b8a2e3e3f1ee55b22d419210354ff015f9f188a44eb74be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.302061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.302146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.302170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.302202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.302227 4773 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.310482 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.328962 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:39Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.405814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.405866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.405882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.405905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.405922 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.480226 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:39 crc kubenswrapper[4773]: E1012 20:25:39.480464 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.480848 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:39 crc kubenswrapper[4773]: E1012 20:25:39.481006 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.509127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.509189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.509206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.509234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.509250 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.612335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.612399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.612416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.612442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.612459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.715547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.715610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.715628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.715653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.715677 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.818893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.818966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.818984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.819040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.819058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.922659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.922754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.922778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.922805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:39 crc kubenswrapper[4773]: I1012 20:25:39.922824 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:39Z","lastTransitionTime":"2025-10-12T20:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.025926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.025968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.025980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.026002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.026015 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.129459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.129503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.129520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.129543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.129562 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.231297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.231330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.231343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.231360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.231372 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.334575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.334626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.334638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.334684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.334697 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.437935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.438002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.438025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.438058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.438081 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.481131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.481369 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:40 crc kubenswrapper[4773]: E1012 20:25:40.481506 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:40 crc kubenswrapper[4773]: E1012 20:25:40.481699 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.540607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.540665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.540689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.540763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.540789 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.644023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.644076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.644093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.644117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.644135 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.747758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.747811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.747828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.747851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.747868 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.850380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.850445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.850464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.850489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.850532 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.965087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.965163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.965187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.965220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.965237 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.985124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.985164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.985182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.985206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:40 crc kubenswrapper[4773]: I1012 20:25:40.985222 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:40Z","lastTransitionTime":"2025-10-12T20:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.016538 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.022318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.022373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.022393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.022416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.022432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.040637 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.045845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.045944 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.045961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.045984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.046001 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.065693 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.071803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.071851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.071868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.071893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.071910 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.095765 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.100524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.100562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.100578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.100601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.100620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.119909 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:41Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.120176 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.122565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.122630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.122651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.122678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.122695 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.226229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.226325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.226344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.226367 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.226424 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.329196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.329278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.329299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.329324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.329342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.432096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.432178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.432199 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.432228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.432250 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.481057 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.481155 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.481359 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:41 crc kubenswrapper[4773]: E1012 20:25:41.481532 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.534807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.534864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.534881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.534906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.534923 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.638259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.638304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.638321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.638345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.638362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.742249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.742315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.742331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.742358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.742376 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.844837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.844929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.844949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.844973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.844990 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.948135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.948219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.948242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.948278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:41 crc kubenswrapper[4773]: I1012 20:25:41.948301 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:41Z","lastTransitionTime":"2025-10-12T20:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.051743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.051789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.051805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.051829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.051847 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.155319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.155377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.155395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.155420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.155439 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.258954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.259021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.259040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.259065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.259087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.361885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.361927 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.361937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.361956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.361970 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.467264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.467328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.467348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.467378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.467395 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.480537 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.480590 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:42 crc kubenswrapper[4773]: E1012 20:25:42.480686 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:42 crc kubenswrapper[4773]: E1012 20:25:42.480901 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.500179 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:4
6Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.519080 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc 
kubenswrapper[4773]: I1012 20:25:42.539657 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.559312 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.571999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.572083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.572152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.572185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.572210 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.579619 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.605946 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.629152 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] 
Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.663266 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269190 6752 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269271 6752 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269451 6752 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.275855 6752 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:37.275907 6752 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:37.275983 6752 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:37.276063 6752 factory.go:656] Stopping watch factory\\\\nI1012 20:25:37.276098 6752 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:37.340812 6752 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:37.340842 6752 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:37.340897 6752 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:37.340920 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:37.341383 6752 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.675313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.675409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.675435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.675464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.675481 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.691777 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.728793 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.752056 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\",\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.774253 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.778438 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.778483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.778501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.778525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.778545 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.802439 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.819989 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.837285 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.854594 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d1028c-04be-4fca-b4dc-d1af16989edb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54bb5a373141b25b2b8a2e3e3f1ee55b22d419210354ff015f9f188a44eb74be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc 
kubenswrapper[4773]: I1012 20:25:42.874403 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.883149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.883218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.883241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.883272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.883294 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.901883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.924496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:42Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.985448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.985712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.985769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.985794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:42 crc kubenswrapper[4773]: I1012 20:25:42.985812 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:42Z","lastTransitionTime":"2025-10-12T20:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.089963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.090075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.090095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.090168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.090187 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.194068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.194134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.194146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.194166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.194180 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.298070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.298147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.298869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.298917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.298939 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.402421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.402487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.402506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.402536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.402554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.480967 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.481048 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:43 crc kubenswrapper[4773]: E1012 20:25:43.481193 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:43 crc kubenswrapper[4773]: E1012 20:25:43.481513 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.506374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.506453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.506478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.506506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.506526 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.609072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.609130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.609147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.609175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.609192 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.713009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.713074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.713089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.713113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.713131 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.816799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.816846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.816856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.816873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.816883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.919633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.919682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.919691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.919709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:43 crc kubenswrapper[4773]: I1012 20:25:43.919741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:43Z","lastTransitionTime":"2025-10-12T20:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.022583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.022633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.022650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.022672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.022687 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.125684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.125734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.125743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.125757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.125769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.227853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.227924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.227936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.227952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.227961 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.331593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.331646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.331661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.331682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.331695 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.434858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.434930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.434954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.434981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.435000 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.480824 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.481104 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:44 crc kubenswrapper[4773]: E1012 20:25:44.481281 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:44 crc kubenswrapper[4773]: E1012 20:25:44.481427 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.539194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.539293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.539315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.539348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.539383 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.643409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.643483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.643505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.643538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.643559 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.746953 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.747010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.747070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.747096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.747139 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.850523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.850588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.850605 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.850634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.850651 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.954262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.954331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.954346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.954369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:44 crc kubenswrapper[4773]: I1012 20:25:44.954385 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:44Z","lastTransitionTime":"2025-10-12T20:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.058375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.058439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.058460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.058492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.058512 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.161940 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.162003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.162017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.162036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.162049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.264954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.265018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.265036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.265060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.265079 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.367840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.367889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.367903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.367923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.367935 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.471489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.471596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.471616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.471646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.471669 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.480838 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.480838 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:45 crc kubenswrapper[4773]: E1012 20:25:45.481049 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:45 crc kubenswrapper[4773]: E1012 20:25:45.481230 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.575222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.575283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.575301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.575326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.575343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.678894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.678936 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.678944 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.678958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.678967 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.781363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.781405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.781417 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.781435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.781445 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.883921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.883973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.883991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.884016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.884032 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.987163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.987211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.987231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.987257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:45 crc kubenswrapper[4773]: I1012 20:25:45.987275 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:45Z","lastTransitionTime":"2025-10-12T20:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.089430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.089468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.089479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.089494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.089505 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.192417 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.192533 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.192560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.192585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.192602 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.295893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.295960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.295980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.296006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.296026 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.399546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.399609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.399626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.399649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.399669 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.480529 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.480576 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:46 crc kubenswrapper[4773]: E1012 20:25:46.480776 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:46 crc kubenswrapper[4773]: E1012 20:25:46.481011 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.503028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.503079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.503097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.503121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.503138 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.606565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.606692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.606709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.606788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.606807 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.710200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.710266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.710286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.710316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.710339 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.813352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.813402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.813413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.813432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.813443 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.916461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.916569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.916592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.916673 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:46 crc kubenswrapper[4773]: I1012 20:25:46.916704 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:46Z","lastTransitionTime":"2025-10-12T20:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.020011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.020052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.020063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.020079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.020089 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.123784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.123856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.123877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.123903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.123922 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.227365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.227442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.227464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.227494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.227516 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.330613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.330692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.330718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.330785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.330804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.434202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.434266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.434282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.434306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.434323 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.480088 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:47 crc kubenswrapper[4773]: E1012 20:25:47.480283 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.480099 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:47 crc kubenswrapper[4773]: E1012 20:25:47.480528 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.538205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.538357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.538375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.538398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.538414 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.641228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.641285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.641303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.641328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.641344 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.744375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.744427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.744435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.744451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.744460 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.846615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.846652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.846661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.846675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.846686 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.948743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.948781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.948789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.948804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:47 crc kubenswrapper[4773]: I1012 20:25:47.948858 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:47Z","lastTransitionTime":"2025-10-12T20:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.050838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.050901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.050921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.050943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.050960 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.153384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.153483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.153494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.153507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.153515 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.256482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.256514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.256525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.256540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.256551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.358932 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.358962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.358973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.358989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.358999 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.461688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.461743 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.461752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.461767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.461776 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.480241 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:48 crc kubenswrapper[4773]: E1012 20:25:48.480443 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.480808 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:48 crc kubenswrapper[4773]: E1012 20:25:48.480960 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.564454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.564502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.564514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.564532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.564546 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.666908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.666978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.667003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.667032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.667073 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.769000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.769060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.769083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.769113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.769137 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.871970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.872029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.872054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.872080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.872101 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.974643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.974693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.974711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.974791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:48 crc kubenswrapper[4773]: I1012 20:25:48.974811 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:48Z","lastTransitionTime":"2025-10-12T20:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.077943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.078019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.078038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.078061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.078077 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.181280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.181341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.181366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.181392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.181413 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.284905 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.284971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.284993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.285024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.285045 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.387862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.387929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.388010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.388048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.388072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.481077 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.481160 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:49 crc kubenswrapper[4773]: E1012 20:25:49.481254 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:49 crc kubenswrapper[4773]: E1012 20:25:49.481492 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.490777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.490818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.490833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.490858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.490881 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.594203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.594286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.594309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.594337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.594358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.697029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.697173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.697203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.697233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.697254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.800584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.800640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.800662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.800690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.800712 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.904127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.904292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.904322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.904349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:49 crc kubenswrapper[4773]: I1012 20:25:49.904461 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:49Z","lastTransitionTime":"2025-10-12T20:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.006371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.006443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.006466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.006495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.006517 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.108901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.108938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.108947 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.108963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.108975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.211320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.211493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.211511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.211535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.211553 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.314475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.314557 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.314574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.314600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.314615 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.417078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.417146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.417161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.417175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.417185 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.480907 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.480885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:50 crc kubenswrapper[4773]: E1012 20:25:50.481123 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:50 crc kubenswrapper[4773]: E1012 20:25:50.481172 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.520637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.520693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.520710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.520761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.520828 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.623137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.623195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.623212 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.623237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.623253 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.725917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.726431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.726455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.726542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.726562 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.829656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.829771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.829797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.829825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.829846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.933027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.933088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.933106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.933131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:50 crc kubenswrapper[4773]: I1012 20:25:50.933148 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:50Z","lastTransitionTime":"2025-10-12T20:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.036161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.036220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.036239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.036264 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.036281 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.138812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.138859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.138871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.138892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.138906 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.241976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.242016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.242033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.242057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.242075 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.285659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.285735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.285752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.285776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.285794 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.307451 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:51Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.312772 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.312853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.312869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.312892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.312940 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.334550 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:51Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.340109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.340206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.340235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.340265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.340285 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.361585 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:51Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.366811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.366893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.366918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.366951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.366980 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.388568 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:51Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.395054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.395124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.395148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.395175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.395205 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.416613 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee41ac78-6c3d-4e51-9248-43b3278b77da\\\",\\\"systemUUID\\\":\\\"e3f9af6f-71c8-4f60-9f2a-bd881e5b9c75\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:51Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.416914 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.419531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.419587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.419604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.419628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.419650 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.480545 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.480571 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.481099 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.481445 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.522036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.522113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.522134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.522160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.522181 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.625800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.625912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.625929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.625954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.625970 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.729171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.729277 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.729305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.729339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.729363 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.772418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.772590 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:51 crc kubenswrapper[4773]: E1012 20:25:51.772657 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs podName:a0e0fa58-fcd9-4002-a975-a98fcba0f364 nodeName:}" failed. No retries permitted until 2025-10-12 20:26:55.772640538 +0000 UTC m=+164.008939098 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs") pod "network-metrics-daemon-6sbfz" (UID: "a0e0fa58-fcd9-4002-a975-a98fcba0f364") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.832620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.832680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.832706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.832761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.832783 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.935553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.935603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.935619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.935644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:51 crc kubenswrapper[4773]: I1012 20:25:51.935661 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:51Z","lastTransitionTime":"2025-10-12T20:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.039592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.039663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.039685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.039710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.039762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.143437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.143511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.143537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.143567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.143588 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.246844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.247214 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.247385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.247600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.247866 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.351051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.351149 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.351168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.351222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.351241 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.454544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.454608 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.454625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.454648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.454664 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.480193 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:52 crc kubenswrapper[4773]: E1012 20:25:52.480392 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.480420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:52 crc kubenswrapper[4773]: E1012 20:25:52.480535 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.500443 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h94p2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34dec32f-3ec7-4897-bb63-9f8e018fe743\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0d2720290cd89928d3516cc5367004e395d0a96d3b9f27fd3e8e7a7e26b29a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8p8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h94p2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.519630 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3d1028c-04be-4fca-b4dc-d1af16989edb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54bb5a373141b25b2b8a2e3e3f1ee55b22d419210354ff015f9f188a44eb74be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e665309bb48cd63a016b3f0c391370371c388b7e999a00b50a9603bf0f5bcaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc 
kubenswrapper[4773]: I1012 20:25:52.539356 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da3fd257-2079-4c29-8de7-b094c176f6f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7167455b9b5d827d7aa735589250f92a389c08fc6ebdbe85993d49a242e2435e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b7c75f2ba7fc978b8dfb517ebcf948811bc9e680cbc26a45cf5cf0042568a2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4e10d383683a890d6079b445b8f7fdac4036284f2b4033945073921ac0b185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09144776491564f4232bfac881d9e62fa7d4d78e05e830e052ab02ce1c49eb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.558288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.558382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.558400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.558468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.558486 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.562016 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.578392 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e328baf784e0c656a64b87e87744d0e6e5ea7e3f84ce34fb1e81116fedaad99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.593825 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6sl6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b8397b6-40c2-4993-a5c7-a2afe63667ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9be71a9825e54345ced1e8eb8509e133e1b788535b141a79ecc483e389148a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wrfzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6sl6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.611788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0e0fa58-fcd9-4002-a975-a98fcba0f364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqvdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6sbfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 
2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.633065 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.652460 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ab1d07ce0e46794ebdc15d4a2f78716cb79e0a9b53492fd8b651bf2bf201fce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad83ff227a9a93cdfd7ce2e472237b7a31e25f479648e61b04e2261f5a916ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.661938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.662001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.662021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.662048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.662064 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.671052 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.690411 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad613ee54c9900c0120ca77fbafe550119a0b05c5b2bc6f684569826be650017\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4a
eef4d555ebf3b4f682e3932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lmg4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cbx9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.705475 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7322c897-b1e2-48d0-a8b9-3c22cc8a4fc1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f438cba9e0a4fb74219fc41d39d6967e501bea657c79601be29ba72e39022c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69589e61d26bedfa87be8f99808fa098a4437
a9435b42a2eea3f5f9dc07eec8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgcjk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rnnqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.722990 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-67c6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ad9308-d890-40f4-9b73-fb4aad78ccd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:25:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:21Z\\\",\\\"message\\\":\\\"2025-10-12T20:24:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0\\\\n2025-10-12T20:24:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_877779ad-e3ae-467f-ba60-f07b3a5ddac0 to /host/opt/cni/bin/\\\\n2025-10-12T20:24:36Z [verbose] multus-daemon started\\\\n2025-10-12T20:24:36Z [verbose] 
Readiness Indicator file check\\\\n2025-10-12T20:25:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pm9pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-67c6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.742265 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T20:25:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269190 6752 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269271 6752 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.269451 6752 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1012 20:25:37.275855 6752 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1012 20:25:37.275907 6752 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1012 20:25:37.275983 6752 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 20:25:37.276063 6752 factory.go:656] Stopping watch factory\\\\nI1012 20:25:37.276098 6752 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 20:25:37.340812 6752 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1012 20:25:37.340842 6752 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1012 20:25:37.340897 6752 ovnkube.go:599] Stopped ovnkube\\\\nI1012 20:25:37.340920 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 20:25:37.341383 6752 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:25:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a1cf0310ed166016
27814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnldc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tzm6q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.764334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.764410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.764428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.764450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.764466 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.769388 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21614af6-f49c-4238-a3db-26b9633443ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://931093e9a5c4f9e042db6c74d9a1b4254de1947779f64beeb43529e9fe359b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3286c41f584f9bb103fa0c6e52c0fb9c523575803b407e6ad60027a25a07425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75753bbbd9c655a3dcfe1651bc26d09c2353a93e591553215264c22b36e2760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b11aa9b5cc015aaec84590b5993a2bd6408fdc61b590362490e6e4a806e2a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d063baeef90403f8fbd28cb759a0df444507673e955a220cf97f5dc93b7e362f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11e16e2033a2c21ef506e08bf9f6b87666ddf1f4e6380e6a5a2e8d5991e6884\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41aadc28df57d261364d5c86e948fca3f7d87e7bdfa6b3cf55a3c6ff97ce776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0316ea49da48af3859474fbc1e3d014ff9731c89dcaf85f8e5071fcc3144e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.784995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af48cfb-2c84-46ab-8042-1cc0f118baf9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f8a4f1bb8f6d61ee19b602ed6a65a1af28faeaa48e3f7e53778d438c4daeaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9075b03c0d182b76430be6b3d2d1a8d5d16ea0350c948a2e7aeb162f1c2bbf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f2ea7bd14392f0e80f5ab2040cc743f299d32a33e56c348e4c6ecba9c4f01e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1daaeaaef81ac3c7f9a4e149016fe72815cc69d08ad99bf025833de195c8a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2489328f6cd3b39528bd8314dfad27734b3709d9974a5485960601d74eb83c5a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T20:24:26Z\\\"
,\\\"message\\\":\\\"W1012 20:24:15.593413 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 20:24:15.594256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760300655 cert, and key in /tmp/serving-cert-2819721240/serving-signer.crt, /tmp/serving-cert-2819721240/serving-signer.key\\\\nI1012 20:24:15.998503 1 observer_polling.go:159] Starting file observer\\\\nW1012 20:24:16.000646 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 20:24:16.000802 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 20:24:16.001458 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2819721240/tls.crt::/tmp/serving-cert-2819721240/tls.key\\\\\\\"\\\\nF1012 20:24:26.321698 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7971ec201068a03bf6c8ba802fec5f12ba9c70b9615eda210a0a188c1b665b32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f57677d3c9785c7ae8aafa5ff9cbd80a297c6a2a259d529ac6517e7887157b1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.808584 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47ca61fe-1ab9-45c7-9d1c-14ff8f8eee5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c75dbaf8f3ae3412bb8f0a097ccb579091485d3bd93a1fdf6b69b40ca89dd502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c30c0c2e35fdb26fbe61bf9eb9601dfb2b409812c9c9a2455dc2fb8e60d5af4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d209a519927732823cf0beba6e84515877e72072fe9524f3a16cf0c67fea5c7a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b485b417dfe9fb6c6da3d7c85ca872b2ce954a709937a003644daf156af811c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.825943 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea2d536bccf43782423895121fd0ef85dd4caff6eb5758f8465027ed757aa4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.842481 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2d85a10-4066-430e-ac9f-533080da69f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T20:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1628cbbcf7181e6c993871a0876023b87b0ade22804e2169068f77db0b11f473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T20:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adba6d301fd16bf7b7b86104c5752011921061d8a070d37a0b35a34dacb6c4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2878f0ce6d01010f5efa1e8993cb8bea06646419bf79c427b366217ec26945aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6c1f196c2da7764aa35e3f3e05b3790ea9ea16648cd6c77f40b82708520a778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed6f1d4283c692ce2cea16ef5dce6e5e814b818c7f1d1c81e5b3431f2e2e7776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7a19129539cdf296316614b90218af564e7891bc0bb4d6432108c1fc6ab2ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://596721ebf702c8b4813010f11bd65173177fda30309fccde55b5bedd772d2ba4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T20:24:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T20:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g7lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T20:24:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jdcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T20:25:52Z is after 2025-08-24T17:21:41Z" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.867282 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.867321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.867332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.867349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.867360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.970614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.970663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.970680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.970703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:52 crc kubenswrapper[4773]: I1012 20:25:52.970741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:52Z","lastTransitionTime":"2025-10-12T20:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.073181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.073252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.073272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.073291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.073306 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.176059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.176179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.176207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.176233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.176251 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.279672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.279798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.279824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.279854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.279877 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.383539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.383614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.383639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.383668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.383687 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.481121 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.481167 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:53 crc kubenswrapper[4773]: E1012 20:25:53.481371 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:53 crc kubenswrapper[4773]: E1012 20:25:53.482168 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.482686 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:25:53 crc kubenswrapper[4773]: E1012 20:25:53.483058 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.486847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.486907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.486989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.487070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.487100 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.591080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.591117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.591128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.591143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.591155 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.693078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.693114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.693122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.693135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.693145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.794972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.795011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.795023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.795037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.795050 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.897584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.897613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.897622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.897635 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.897644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.999390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.999416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.999423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.999435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:53 crc kubenswrapper[4773]: I1012 20:25:53.999443 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:53Z","lastTransitionTime":"2025-10-12T20:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.102592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.102644 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.102656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.102675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.102687 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.204570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.204609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.204621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.204637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.204649 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.306791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.306828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.306838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.306853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.306863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.409851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.409903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.409924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.409943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.409955 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.480629 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.480705 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:54 crc kubenswrapper[4773]: E1012 20:25:54.480836 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:54 crc kubenswrapper[4773]: E1012 20:25:54.480960 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.512105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.512176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.512188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.512205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.512216 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.614520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.614569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.614581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.614599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.614610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.735290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.735330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.735343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.735360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.735373 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.837639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.837681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.837695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.837748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.837762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.939684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.939710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.939733 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.939746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:54 crc kubenswrapper[4773]: I1012 20:25:54.939754 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:54Z","lastTransitionTime":"2025-10-12T20:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.042303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.042337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.042346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.042362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.042377 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.144260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.144338 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.144353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.144373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.144387 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.246688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.246754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.246765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.246784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.246796 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.349276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.349572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.349676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.349766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.349836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.451863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.452173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.452304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.452441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.452554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.480933 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.480934 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:55 crc kubenswrapper[4773]: E1012 20:25:55.481176 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:55 crc kubenswrapper[4773]: E1012 20:25:55.481081 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.554868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.554965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.554977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.554993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.555006 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.656825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.656868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.656882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.656903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.656931 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.758956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.758983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.758991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.759002 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.759010 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.861522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.862079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.862156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.862251 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.862330 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.964144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.964178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.964189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.964208 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:55 crc kubenswrapper[4773]: I1012 20:25:55.964220 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:55Z","lastTransitionTime":"2025-10-12T20:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.066021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.066055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.066063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.066098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.066109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.168083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.168318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.168382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.168448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.168505 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.270818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.270864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.270878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.270933 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.270947 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.373386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.373442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.373459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.373482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.373499 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.475171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.475213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.475222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.475235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.475249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.480518 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.480605 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:56 crc kubenswrapper[4773]: E1012 20:25:56.480674 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:56 crc kubenswrapper[4773]: E1012 20:25:56.480805 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.577590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.577646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.577664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.577694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.577753 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.680595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.680653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.680669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.680691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.680707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.783663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.783760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.783777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.783799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.783814 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.887170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.887797 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.887820 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.887837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.887852 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.990244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.990305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.990315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.990374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:56 crc kubenswrapper[4773]: I1012 20:25:56.990386 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:56Z","lastTransitionTime":"2025-10-12T20:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.097384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.097414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.097422 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.097435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.097444 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.199656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.199688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.199698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.199713 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.199749 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.303661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.303877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.303910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.303938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.303955 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.405877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.405929 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.405938 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.405976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.405986 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.480229 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.480248 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:57 crc kubenswrapper[4773]: E1012 20:25:57.480372 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:57 crc kubenswrapper[4773]: E1012 20:25:57.480512 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.508271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.508352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.508372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.508394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.508411 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.610776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.610822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.610835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.610854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.610865 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.714092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.714137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.714154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.714175 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.714190 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.817694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.817753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.817766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.817782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.817792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.920563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.920631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.920655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.920686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:57 crc kubenswrapper[4773]: I1012 20:25:57.920708 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:57Z","lastTransitionTime":"2025-10-12T20:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.024163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.024228 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.024245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.024268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.024290 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.126777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.126838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.126854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.126879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.126899 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.229355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.229415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.229434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.229462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.229481 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.332197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.332250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.332290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.332313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.332327 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.434548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.434595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.434610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.434630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.434646 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.480684 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.480767 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:25:58 crc kubenswrapper[4773]: E1012 20:25:58.480969 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:25:58 crc kubenswrapper[4773]: E1012 20:25:58.481089 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.537414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.537457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.537471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.537486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.537498 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.641164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.641219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.641238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.641259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.641276 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.743813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.743882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.743897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.743916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.743930 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.846534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.846576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.846585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.846600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.846610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.948707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.948775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.948792 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.948807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:58 crc kubenswrapper[4773]: I1012 20:25:58.948818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:58Z","lastTransitionTime":"2025-10-12T20:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.051007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.051055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.051090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.051107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.051119 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.153046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.153093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.153105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.153124 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.153140 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.256451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.256505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.256518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.256545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.256558 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.358748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.358779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.358788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.358801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.358814 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.460883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.460914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.460922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.460935 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.460943 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.480179 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.480254 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:25:59 crc kubenswrapper[4773]: E1012 20:25:59.480315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:25:59 crc kubenswrapper[4773]: E1012 20:25:59.480386 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.564041 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.564082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.564093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.564106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.564116 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.667132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.667178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.667189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.667208 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.667220 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.773108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.773156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.773170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.773188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.773204 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.875402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.875443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.875454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.875468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.875477 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.977424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.977458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.977467 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.977479 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:25:59 crc kubenswrapper[4773]: I1012 20:25:59.977487 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:25:59Z","lastTransitionTime":"2025-10-12T20:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.079983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.080012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.080020 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.080032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.080041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.181712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.181777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.181787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.181801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.181836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.284184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.284232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.284250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.284270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.284283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.386626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.386664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.386675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.386688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.386697 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.480965 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:00 crc kubenswrapper[4773]: E1012 20:26:00.481137 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.481411 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:00 crc kubenswrapper[4773]: E1012 20:26:00.481572 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.488499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.488542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.488559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.488580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.488597 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.591748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.591801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.591812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.591827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.591838 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.694258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.694339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.694363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.694393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.694429 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.797657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.797928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.797944 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.797960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.797971 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.900431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.900749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.900763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.900776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:00 crc kubenswrapper[4773]: I1012 20:26:00.900790 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:00Z","lastTransitionTime":"2025-10-12T20:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.002331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.002831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.002854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.002871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.002880 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.105596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.105677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.105698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.105784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.105807 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.208550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.208591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.208600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.208615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.208626 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.311238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.311353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.311372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.311403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.311425 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.414442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.414523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.414544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.414566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.414583 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.480911 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.480980 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:01 crc kubenswrapper[4773]: E1012 20:26:01.481027 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:01 crc kubenswrapper[4773]: E1012 20:26:01.481099 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.517665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.518010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.518169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.518312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.518454 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.629915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.630235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.630304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.630374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.630434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.732926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.732958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.732966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.732979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.732987 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.807230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.807286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.807302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.807324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.807342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.842906 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.842930 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.842939 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.842951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.842959 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T20:26:01Z","lastTransitionTime":"2025-10-12T20:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.877520 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz"] Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.877861 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.881233 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.881754 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.882862 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.882891 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.921554 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rnnqj" podStartSLOduration=88.921525855 podStartE2EDuration="1m28.921525855s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:01.904243541 +0000 UTC m=+110.140542101" watchObservedRunningTime="2025-10-12 20:26:01.921525855 +0000 UTC m=+110.157824455" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.932595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.932648 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab07803-a185-403a-b1cb-70e6fd56a450-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.932669 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eab07803-a185-403a-b1cb-70e6fd56a450-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.932705 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eab07803-a185-403a-b1cb-70e6fd56a450-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:01 crc kubenswrapper[4773]: I1012 20:26:01.932757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.004109 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podStartSLOduration=90.004090208 podStartE2EDuration="1m30.004090208s" 
podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:01.988233833 +0000 UTC m=+110.224532383" watchObservedRunningTime="2025-10-12 20:26:02.004090208 +0000 UTC m=+110.240388768" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.004391 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-67c6h" podStartSLOduration=90.004383446 podStartE2EDuration="1m30.004383446s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.003668586 +0000 UTC m=+110.239967146" watchObservedRunningTime="2025-10-12 20:26:02.004383446 +0000 UTC m=+110.240681996" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab07803-a185-403a-b1cb-70e6fd56a450-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033809 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eab07803-a185-403a-b1cb-70e6fd56a450-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eab07803-a185-403a-b1cb-70e6fd56a450-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033879 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033898 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.033931 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/eab07803-a185-403a-b1cb-70e6fd56a450-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc 
kubenswrapper[4773]: I1012 20:26:02.034953 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab07803-a185-403a-b1cb-70e6fd56a450-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.039765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eab07803-a185-403a-b1cb-70e6fd56a450-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.040852 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jdcn7" podStartSLOduration=90.040836225 podStartE2EDuration="1m30.040836225s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.039283292 +0000 UTC m=+110.275581852" watchObservedRunningTime="2025-10-12 20:26:02.040836225 +0000 UTC m=+110.277134785" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.051276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eab07803-a185-403a-b1cb-70e6fd56a450-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t99mz\" (UID: \"eab07803-a185-403a-b1cb-70e6fd56a450\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.068076 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=89.068046431 podStartE2EDuration="1m29.068046431s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.065075689 +0000 UTC m=+110.301374259" watchObservedRunningTime="2025-10-12 20:26:02.068046431 +0000 UTC m=+110.304344991" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.081262 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.081241672 podStartE2EDuration="1m30.081241672s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.080936424 +0000 UTC m=+110.317234984" watchObservedRunningTime="2025-10-12 20:26:02.081241672 +0000 UTC m=+110.317540232" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.094235 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.094206507 podStartE2EDuration="1m29.094206507s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.09357558 +0000 UTC m=+110.329874150" watchObservedRunningTime="2025-10-12 20:26:02.094206507 +0000 UTC m=+110.330505067" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.118082 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6sl6z" podStartSLOduration=90.118066741 podStartE2EDuration="1m30.118066741s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 
20:26:02.117216478 +0000 UTC m=+110.353515038" watchObservedRunningTime="2025-10-12 20:26:02.118066741 +0000 UTC m=+110.354365301" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.141934 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.141922215 podStartE2EDuration="25.141922215s" podCreationTimestamp="2025-10-12 20:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.141274287 +0000 UTC m=+110.377572847" watchObservedRunningTime="2025-10-12 20:26:02.141922215 +0000 UTC m=+110.378220775" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.142274 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h94p2" podStartSLOduration=90.142269115 podStartE2EDuration="1m30.142269115s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.129995198 +0000 UTC m=+110.366293758" watchObservedRunningTime="2025-10-12 20:26:02.142269115 +0000 UTC m=+110.378567675" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.155165 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.155151258 podStartE2EDuration="1m3.155151258s" podCreationTimestamp="2025-10-12 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:02.154591652 +0000 UTC m=+110.390890212" watchObservedRunningTime="2025-10-12 20:26:02.155151258 +0000 UTC m=+110.391449818" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.191263 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.480198 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:02 crc kubenswrapper[4773]: I1012 20:26:02.480328 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:02 crc kubenswrapper[4773]: E1012 20:26:02.481175 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:02 crc kubenswrapper[4773]: E1012 20:26:02.481263 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:03 crc kubenswrapper[4773]: I1012 20:26:03.024399 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" event={"ID":"eab07803-a185-403a-b1cb-70e6fd56a450","Type":"ContainerStarted","Data":"3a0df27d7e44bf97fde67afb4fded65488cbeb98ebeba96ec8da315a86820bea"} Oct 12 20:26:03 crc kubenswrapper[4773]: I1012 20:26:03.024438 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" event={"ID":"eab07803-a185-403a-b1cb-70e6fd56a450","Type":"ContainerStarted","Data":"eaaf5fe1d9772712e151799462ca6311cf9cc466221d6b650020e7d0c3ecbc7d"} Oct 12 20:26:03 crc kubenswrapper[4773]: I1012 20:26:03.480088 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:03 crc kubenswrapper[4773]: E1012 20:26:03.480264 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:03 crc kubenswrapper[4773]: I1012 20:26:03.480110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:03 crc kubenswrapper[4773]: E1012 20:26:03.480671 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:04 crc kubenswrapper[4773]: I1012 20:26:04.480640 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:04 crc kubenswrapper[4773]: I1012 20:26:04.480842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:04 crc kubenswrapper[4773]: E1012 20:26:04.481408 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:04 crc kubenswrapper[4773]: E1012 20:26:04.481637 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:04 crc kubenswrapper[4773]: I1012 20:26:04.481797 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:26:04 crc kubenswrapper[4773]: E1012 20:26:04.481992 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tzm6q_openshift-ovn-kubernetes(9bd89b89-9347-4b0d-8861-4ff26c9640b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" Oct 12 20:26:05 crc kubenswrapper[4773]: I1012 20:26:05.480932 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:05 crc kubenswrapper[4773]: I1012 20:26:05.481091 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:05 crc kubenswrapper[4773]: E1012 20:26:05.481244 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:05 crc kubenswrapper[4773]: E1012 20:26:05.481539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:06 crc kubenswrapper[4773]: I1012 20:26:06.481100 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:06 crc kubenswrapper[4773]: E1012 20:26:06.481609 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:06 crc kubenswrapper[4773]: I1012 20:26:06.481131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:06 crc kubenswrapper[4773]: E1012 20:26:06.481939 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:07 crc kubenswrapper[4773]: I1012 20:26:07.480733 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:07 crc kubenswrapper[4773]: E1012 20:26:07.481057 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:07 crc kubenswrapper[4773]: I1012 20:26:07.480728 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:07 crc kubenswrapper[4773]: E1012 20:26:07.481287 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.039128 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/1.log" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.040213 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/0.log" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.040293 4773 generic.go:334] "Generic (PLEG): container finished" podID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" containerID="4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec" exitCode=1 Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.040344 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerDied","Data":"4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec"} Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.040430 4773 scope.go:117] "RemoveContainer" containerID="249f42de3da594f79af2d58dc162426519e246b782a103f4d32992e289c04e2a" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.040835 4773 scope.go:117] "RemoveContainer" containerID="4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec" Oct 12 20:26:08 crc kubenswrapper[4773]: E1012 20:26:08.041079 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-67c6h_openshift-multus(69ad9308-d890-40f4-9b73-fb4aad78ccd1)\"" pod="openshift-multus/multus-67c6h" podUID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.061907 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t99mz" podStartSLOduration=96.061888925 podStartE2EDuration="1m36.061888925s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:03.037395695 +0000 UTC m=+111.273694255" watchObservedRunningTime="2025-10-12 20:26:08.061888925 +0000 UTC m=+116.298187485" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.480407 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:08 crc kubenswrapper[4773]: I1012 20:26:08.480447 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:08 crc kubenswrapper[4773]: E1012 20:26:08.480564 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:08 crc kubenswrapper[4773]: E1012 20:26:08.480636 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:09 crc kubenswrapper[4773]: I1012 20:26:09.050844 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/1.log" Oct 12 20:26:09 crc kubenswrapper[4773]: I1012 20:26:09.480605 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:09 crc kubenswrapper[4773]: E1012 20:26:09.480733 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:09 crc kubenswrapper[4773]: I1012 20:26:09.480617 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:09 crc kubenswrapper[4773]: E1012 20:26:09.480879 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:10 crc kubenswrapper[4773]: I1012 20:26:10.480347 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:10 crc kubenswrapper[4773]: I1012 20:26:10.480527 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:10 crc kubenswrapper[4773]: E1012 20:26:10.480651 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:10 crc kubenswrapper[4773]: E1012 20:26:10.480878 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:11 crc kubenswrapper[4773]: I1012 20:26:11.480304 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:11 crc kubenswrapper[4773]: I1012 20:26:11.480362 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:11 crc kubenswrapper[4773]: E1012 20:26:11.480445 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:11 crc kubenswrapper[4773]: E1012 20:26:11.480600 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:12 crc kubenswrapper[4773]: E1012 20:26:12.416830 4773 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 12 20:26:12 crc kubenswrapper[4773]: I1012 20:26:12.480369 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:12 crc kubenswrapper[4773]: I1012 20:26:12.480377 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:12 crc kubenswrapper[4773]: E1012 20:26:12.481484 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:12 crc kubenswrapper[4773]: E1012 20:26:12.481648 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:12 crc kubenswrapper[4773]: E1012 20:26:12.593811 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 20:26:13 crc kubenswrapper[4773]: I1012 20:26:13.480427 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:13 crc kubenswrapper[4773]: E1012 20:26:13.480607 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:13 crc kubenswrapper[4773]: I1012 20:26:13.480441 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:13 crc kubenswrapper[4773]: E1012 20:26:13.480952 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:14 crc kubenswrapper[4773]: I1012 20:26:14.480255 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:14 crc kubenswrapper[4773]: E1012 20:26:14.480443 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:14 crc kubenswrapper[4773]: I1012 20:26:14.480516 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:14 crc kubenswrapper[4773]: E1012 20:26:14.480769 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:15 crc kubenswrapper[4773]: I1012 20:26:15.480103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:15 crc kubenswrapper[4773]: I1012 20:26:15.480121 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:15 crc kubenswrapper[4773]: E1012 20:26:15.480271 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:15 crc kubenswrapper[4773]: E1012 20:26:15.480448 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:16 crc kubenswrapper[4773]: I1012 20:26:16.480837 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:16 crc kubenswrapper[4773]: E1012 20:26:16.481068 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:16 crc kubenswrapper[4773]: I1012 20:26:16.481409 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:16 crc kubenswrapper[4773]: E1012 20:26:16.481635 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:17 crc kubenswrapper[4773]: I1012 20:26:17.480409 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:17 crc kubenswrapper[4773]: E1012 20:26:17.480580 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:17 crc kubenswrapper[4773]: I1012 20:26:17.480692 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:17 crc kubenswrapper[4773]: E1012 20:26:17.480990 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:17 crc kubenswrapper[4773]: E1012 20:26:17.595417 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 20:26:18 crc kubenswrapper[4773]: I1012 20:26:18.480565 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:18 crc kubenswrapper[4773]: I1012 20:26:18.480626 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:18 crc kubenswrapper[4773]: E1012 20:26:18.480703 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:18 crc kubenswrapper[4773]: E1012 20:26:18.481012 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:18 crc kubenswrapper[4773]: I1012 20:26:18.481288 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.082009 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/3.log" Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.085047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerStarted","Data":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.085511 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.117906 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podStartSLOduration=106.117888049 podStartE2EDuration="1m46.117888049s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:19.116445999 +0000 UTC m=+127.352744559" watchObservedRunningTime="2025-10-12 20:26:19.117888049 +0000 UTC m=+127.354186619" Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.313034 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6sbfz"] Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.313155 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:19 crc kubenswrapper[4773]: E1012 20:26:19.313258 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:19 crc kubenswrapper[4773]: I1012 20:26:19.480437 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:19 crc kubenswrapper[4773]: E1012 20:26:19.480561 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:20 crc kubenswrapper[4773]: I1012 20:26:20.480392 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:20 crc kubenswrapper[4773]: I1012 20:26:20.480502 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:20 crc kubenswrapper[4773]: E1012 20:26:20.480704 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:20 crc kubenswrapper[4773]: I1012 20:26:20.480840 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:20 crc kubenswrapper[4773]: E1012 20:26:20.480926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:20 crc kubenswrapper[4773]: E1012 20:26:20.481059 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:21 crc kubenswrapper[4773]: I1012 20:26:21.480231 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:21 crc kubenswrapper[4773]: E1012 20:26:21.480621 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:21 crc kubenswrapper[4773]: I1012 20:26:21.480816 4773 scope.go:117] "RemoveContainer" containerID="4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec" Oct 12 20:26:22 crc kubenswrapper[4773]: I1012 20:26:22.098819 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/1.log" Oct 12 20:26:22 crc kubenswrapper[4773]: I1012 20:26:22.098894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerStarted","Data":"47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126"} Oct 12 20:26:22 crc kubenswrapper[4773]: I1012 20:26:22.480381 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:22 crc kubenswrapper[4773]: I1012 20:26:22.480423 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:22 crc kubenswrapper[4773]: I1012 20:26:22.480510 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:22 crc kubenswrapper[4773]: E1012 20:26:22.482738 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:22 crc kubenswrapper[4773]: E1012 20:26:22.483148 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:22 crc kubenswrapper[4773]: E1012 20:26:22.483292 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:22 crc kubenswrapper[4773]: E1012 20:26:22.595971 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 20:26:23 crc kubenswrapper[4773]: I1012 20:26:23.480682 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:23 crc kubenswrapper[4773]: E1012 20:26:23.480953 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:24 crc kubenswrapper[4773]: I1012 20:26:24.480908 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:24 crc kubenswrapper[4773]: I1012 20:26:24.480961 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:24 crc kubenswrapper[4773]: I1012 20:26:24.481025 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:24 crc kubenswrapper[4773]: E1012 20:26:24.481104 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:24 crc kubenswrapper[4773]: E1012 20:26:24.481313 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:24 crc kubenswrapper[4773]: E1012 20:26:24.481466 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:25 crc kubenswrapper[4773]: I1012 20:26:25.064241 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:26:25 crc kubenswrapper[4773]: I1012 20:26:25.480445 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:25 crc kubenswrapper[4773]: E1012 20:26:25.480625 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:26 crc kubenswrapper[4773]: I1012 20:26:26.480820 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:26 crc kubenswrapper[4773]: I1012 20:26:26.480858 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:26 crc kubenswrapper[4773]: E1012 20:26:26.480942 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 20:26:26 crc kubenswrapper[4773]: I1012 20:26:26.480978 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:26 crc kubenswrapper[4773]: E1012 20:26:26.481052 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 20:26:26 crc kubenswrapper[4773]: E1012 20:26:26.481124 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6sbfz" podUID="a0e0fa58-fcd9-4002-a975-a98fcba0f364" Oct 12 20:26:27 crc kubenswrapper[4773]: I1012 20:26:27.480039 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:27 crc kubenswrapper[4773]: E1012 20:26:27.480227 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.480240 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.480245 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.480273 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.484168 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.484421 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.485108 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.485205 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.485434 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 12 20:26:28 crc kubenswrapper[4773]: I1012 20:26:28.487653 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 12 20:26:29 crc kubenswrapper[4773]: I1012 20:26:29.481073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.685848 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.736765 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.737481 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.737923 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.738163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.739761 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.740210 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.742586 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.742982 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.743323 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.743775 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.747077 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.747262 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.747887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.748249 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.748405 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kx756"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.749278 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.750663 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmsrw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.751452 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.752853 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.753260 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.753354 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.752919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.752919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.752991 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.753608 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.753637 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.754065 
4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.755558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.755874 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.755903 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.756092 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.756227 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.756746 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.757016 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.757318 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.757508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.757751 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 
20:26:32.757760 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.767485 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.767560 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.767829 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.767871 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.767881 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.768047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.768201 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.768255 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.771044 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.771676 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.772600 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.775066 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-77jxw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.775301 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.775560 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cpc54"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.776001 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.776514 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.781252 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.783247 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.783693 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-m6mmg"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.784945 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.789048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.789158 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.791049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.791996 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.793232 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.798787 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n6tdw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.821261 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.821702 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.822174 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.822280 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.822654 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.822954 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.825122 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.826633 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.829295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.829704 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.831066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.831799 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.832161 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.832933 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.833942 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.834137 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.834214 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.834287 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.834507 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.835101 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.835118 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.835271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.835820 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.835844 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 
20:26:32.835994 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.836025 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.836372 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.836707 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.836900 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.836967 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.837084 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.838849 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.841098 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.847960 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.848163 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.848778 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849051 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849161 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849251 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849478 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849542 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.849629 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.853194 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.872158 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-config\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878257 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblzk\" (UniqueName: \"kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit-dir\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878291 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6zv\" (UniqueName: \"kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878338 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-serving-cert\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878352 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878368 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsx8\" (UniqueName: \"kubernetes.io/projected/b95bacee-8e7a-4f84-a635-5fe22d2a700e-kube-api-access-gwsx8\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-encryption-config\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878397 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 
12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878426 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20350140-2938-4a4b-b17c-6532ba45ac98-serving-cert\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f85w\" (UniqueName: \"kubernetes.io/projected/bf381381-f5d3-4217-8a9c-cf527e2c6c65-kube-api-access-9f85w\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878454 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b95bacee-8e7a-4f84-a635-5fe22d2a700e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878526 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878541 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e5e72fa2-1994-43c0-940c-bf63a893b4c9-machine-approver-tls\") pod \"machine-approver-56656f9798-kx756\" (UID: 
\"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gf2\" (UniqueName: \"kubernetes.io/projected/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-kube-api-access-x6gf2\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878589 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7htq\" (UniqueName: \"kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878605 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b95bacee-8e7a-4f84-a635-5fe22d2a700e-serving-cert\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878619 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878648 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-node-pullsecrets\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-serving-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbdm\" (UniqueName: \"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-kube-api-access-xtbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: 
\"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878695 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878728 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-encryption-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878744 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ea34ec-5e6f-40ad-b636-437335b1025a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878772 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhm79\" (UniqueName: \"kubernetes.io/projected/069f92ef-1b80-414b-a631-08b572fdd419-kube-api-access-qhm79\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf381381-f5d3-4217-8a9c-cf527e2c6c65-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878823 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-client\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67l5h\" (UniqueName: \"kubernetes.io/projected/8161c743-a5fd-49bc-ae70-3b249ec900dc-kube-api-access-67l5h\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878886 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75vd\" (UniqueName: \"kubernetes.io/projected/55db89f5-582b-4326-8150-4e51c83f0706-kube-api-access-r75vd\") pod \"downloads-7954f5f757-m6mmg\" (UID: \"55db89f5-582b-4326-8150-4e51c83f0706\") " pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878918 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ea34ec-5e6f-40ad-b636-437335b1025a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878934 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-auth-proxy-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-client\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-config\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878977 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config\") pod \"console-f9d7485db-gfc75\" (UID: 
\"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.878993 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-trusted-ca\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879008 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879023 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-dir\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879053 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879082 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpjz\" (UniqueName: \"kubernetes.io/projected/d1f75e12-58b5-40ee-8254-70653c1b78bf-kube-api-access-twpjz\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879098 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjczv\" (UniqueName: \"kubernetes.io/projected/e5e72fa2-1994-43c0-940c-bf63a893b4c9-kube-api-access-zjczv\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-policies\") pod \"apiserver-7bbb656c7d-rh842\" (UID: 
\"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879125 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8161c743-a5fd-49bc-ae70-3b249ec900dc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879139 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879153 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-service-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879167 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879182 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-serving-cert\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161c743-a5fd-49bc-ae70-3b249ec900dc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf5gr\" (UniqueName: \"kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-images\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: 
\"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879264 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879277 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879292 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879306 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-config\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879355 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5jq\" (UniqueName: \"kubernetes.io/projected/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-kube-api-access-9x5jq\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879369 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-image-import-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879412 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879433 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r5z7\" (UniqueName: \"kubernetes.io/projected/20350140-2938-4a4b-b17c-6532ba45ac98-kube-api-access-2r5z7\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.879448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069f92ef-1b80-414b-a631-08b572fdd419-serving-cert\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.884050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 12 20:26:32 crc 
kubenswrapper[4773]: I1012 20:26:32.885841 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.885906 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886045 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886129 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886164 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886249 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886293 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886063 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886385 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886417 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886550 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.886607 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.887864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.888481 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.894457 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.913068 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-77jxw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.913125 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.914153 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hrdqp"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.917200 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.917934 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.918358 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 12 20:26:32 crc 
kubenswrapper[4773]: I1012 20:26:32.918764 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.919116 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.919149 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.919313 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.919348 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.919492 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.923867 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.924418 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.928150 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.929094 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmsrw"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.934797 4773 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.941919 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.965918 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.971137 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-55vqv"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.971492 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.971756 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.971883 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.974912 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.975642 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gp2tr"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.976101 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.976333 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.978279 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.978419 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.978431 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.978517 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.978545 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.979103 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.979988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6zv\" (UniqueName: \"kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980021 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-serving-cert\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980085 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsx8\" (UniqueName: \"kubernetes.io/projected/b95bacee-8e7a-4f84-a635-5fe22d2a700e-kube-api-access-gwsx8\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-encryption-config\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980161 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20350140-2938-4a4b-b17c-6532ba45ac98-serving-cert\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f85w\" (UniqueName: \"kubernetes.io/projected/bf381381-f5d3-4217-8a9c-cf527e2c6c65-kube-api-access-9f85w\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980215 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b95bacee-8e7a-4f84-a635-5fe22d2a700e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: 
\"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e5e72fa2-1994-43c0-940c-bf63a893b4c9-machine-approver-tls\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980367 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gf2\" (UniqueName: \"kubernetes.io/projected/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-kube-api-access-x6gf2\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7htq\" (UniqueName: \"kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980444 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b95bacee-8e7a-4f84-a635-5fe22d2a700e-serving-cert\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbdm\" (UniqueName: \"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-kube-api-access-xtbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-node-pullsecrets\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 
20:26:32.980563 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-serving-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980586 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980610 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-encryption-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ea34ec-5e6f-40ad-b636-437335b1025a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980674 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhm79\" (UniqueName: \"kubernetes.io/projected/069f92ef-1b80-414b-a631-08b572fdd419-kube-api-access-qhm79\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf381381-f5d3-4217-8a9c-cf527e2c6c65-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980763 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-client\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67l5h\" (UniqueName: 
\"kubernetes.io/projected/8161c743-a5fd-49bc-ae70-3b249ec900dc-kube-api-access-67l5h\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980805 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980834 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r75vd\" (UniqueName: \"kubernetes.io/projected/55db89f5-582b-4326-8150-4e51c83f0706-kube-api-access-r75vd\") pod \"downloads-7954f5f757-m6mmg\" (UID: \"55db89f5-582b-4326-8150-4e51c83f0706\") " pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980884 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980912 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-auth-proxy-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980935 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-client\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-config\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.980980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ea34ec-5e6f-40ad-b636-437335b1025a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 
20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981026 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981047 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-trusted-ca\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-dir\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981114 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjczv\" (UniqueName: \"kubernetes.io/projected/e5e72fa2-1994-43c0-940c-bf63a893b4c9-kube-api-access-zjczv\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-policies\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8161c743-a5fd-49bc-ae70-3b249ec900dc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981200 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpjz\" (UniqueName: 
\"kubernetes.io/projected/d1f75e12-58b5-40ee-8254-70653c1b78bf-kube-api-access-twpjz\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981247 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-service-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-serving-cert\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981335 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161c743-a5fd-49bc-ae70-3b249ec900dc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-images\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf5gr\" (UniqueName: \"kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981479 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981502 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.981564 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-config\") pod \"authentication-operator-69f744f599-77jxw\" 
(UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.982754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.983242 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.983377 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.987478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e5e72fa2-1994-43c0-940c-bf63a893b4c9-machine-approver-tls\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.988386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.988470 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-node-pullsecrets\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.988993 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.989748 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.991763 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.994188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-serving-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.994433 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.994948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995093 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m6mmg"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995460 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9ea34ec-5e6f-40ad-b636-437335b1025a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5jq\" (UniqueName: \"kubernetes.io/projected/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-kube-api-access-9x5jq\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995562 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-image-import-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r5z7\" (UniqueName: \"kubernetes.io/projected/20350140-2938-4a4b-b17c-6532ba45ac98-kube-api-access-2r5z7\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995618 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069f92ef-1b80-414b-a631-08b572fdd419-serving-cert\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-config\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995686 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xblzk\" (UniqueName: \"kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk\") pod \"route-controller-manager-6576b87f9c-mpct2\" 
(UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit-dir\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.995852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit-dir\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.996840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.997277 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.997311 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.997614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-config\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:32 crc kubenswrapper[4773]: I1012 20:26:32.998383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:32.999054 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-image-import-ca\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.001016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.002391 4773 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.002588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.002959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.003076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.003503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-audit\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.005077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b95bacee-8e7a-4f84-a635-5fe22d2a700e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.005881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-service-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.006310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.006844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.008658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n6tdw"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.009765 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.010482 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgcxv"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.011246 4773 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.013055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.014468 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.017538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.018507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-config\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.019235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:33 
crc kubenswrapper[4773]: I1012 20:26:33.019678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/069f92ef-1b80-414b-a631-08b572fdd419-trusted-ca\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.019800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-encryption-config\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.022089 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.022687 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161c743-a5fd-49bc-ae70-3b249ec900dc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.024103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.024704 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-dir\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.025030 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5e72fa2-1994-43c0-940c-bf63a893b4c9-auth-proxy-config\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.025205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1f75e12-58b5-40ee-8254-70653c1b78bf-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.025272 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.026017 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20350140-2938-4a4b-b17c-6532ba45ac98-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.026278 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.026696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-images\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.027010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20350140-2938-4a4b-b17c-6532ba45ac98-serving-cert\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.027236 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.027670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf381381-f5d3-4217-8a9c-cf527e2c6c65-config\") pod 
\"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.027942 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.028055 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.028577 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.028867 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.029189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.028061 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/069f92ef-1b80-414b-a631-08b572fdd419-serving-cert\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.029461 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.029597 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.029940 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.031147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.031241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.031422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.031547 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.031640 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.032311 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.033472 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74w74"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.035222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.040204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-serving-cert\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.040587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-audit-policies\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.040893 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-etcd-client\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.041225 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.042002 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.066306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.066473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.092325 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf381381-f5d3-4217-8a9c-cf527e2c6c65-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.093632 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-etcd-client\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.093778 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.093898 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.093948 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.094159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.094535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-serving-cert\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.094643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1f75e12-58b5-40ee-8254-70653c1b78bf-encryption-config\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.094902 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.094914 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.095425 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.095452 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.095567 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " pod="openshift-console/console-f9d7485db-gfc75" Oct 12 
20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.095696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b95bacee-8e7a-4f84-a635-5fe22d2a700e-serving-cert\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.097814 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8161c743-a5fd-49bc-ae70-3b249ec900dc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.097862 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jbzcl"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.100118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ea34ec-5e6f-40ad-b636-437335b1025a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.104204 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.104890 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.105634 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.112052 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.114392 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.114902 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.121416 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.122658 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.123256 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86ftc"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.123816 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.124154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.124322 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.128022 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.128469 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.128916 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxq5p"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.131765 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.135873 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.137069 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.139074 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgcxv"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.140018 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zdjkd"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.141830 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cpc54"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.141945 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.142111 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.145130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.145224 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.145271 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-55vqv"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.146738 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.147319 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.148694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.150597 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.150682 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.152800 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.153850 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.157785 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86ftc"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.157820 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jbzcl"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.157831 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.159779 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.159801 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-84fdf"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.160969 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.161640 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.164159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.167061 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gp2tr"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.167751 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74w74"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.168516 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.171620 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.171675 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.171687 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.174461 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zdjkd"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.174527 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.175685 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxq5p"] Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.181102 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.200421 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.221233 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.239864 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.259728 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.280064 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.300561 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.320322 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.339901 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.359977 4773 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.379587 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.400137 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.420231 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.440860 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.460466 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.479756 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.500882 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.540306 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.560149 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.580103 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.600914 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.620869 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.640512 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.660977 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.682414 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.701054 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.721000 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.740471 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.780584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6zv\" (UniqueName: \"kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv\") pod \"console-f9d7485db-gfc75\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " 
pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.799339 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gf2\" (UniqueName: \"kubernetes.io/projected/6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba-kube-api-access-x6gf2\") pod \"cluster-samples-operator-665b6dd947-snnz9\" (UID: \"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.824748 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7htq\" (UniqueName: \"kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq\") pod \"controller-manager-879f6c89f-wc5dc\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.836876 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67l5h\" (UniqueName: \"kubernetes.io/projected/8161c743-a5fd-49bc-ae70-3b249ec900dc-kube-api-access-67l5h\") pod \"openshift-apiserver-operator-796bbdcf4f-fvdbd\" (UID: \"8161c743-a5fd-49bc-ae70-3b249ec900dc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.857358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75vd\" (UniqueName: \"kubernetes.io/projected/55db89f5-582b-4326-8150-4e51c83f0706-kube-api-access-r75vd\") pod \"downloads-7954f5f757-m6mmg\" (UID: \"55db89f5-582b-4326-8150-4e51c83f0706\") " pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.877484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.888809 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.902831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5jq\" (UniqueName: \"kubernetes.io/projected/1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6-kube-api-access-9x5jq\") pod \"apiserver-7bbb656c7d-rh842\" (UID: \"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.920181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhm79\" (UniqueName: \"kubernetes.io/projected/069f92ef-1b80-414b-a631-08b572fdd419-kube-api-access-qhm79\") pod \"console-operator-58897d9998-cpc54\" (UID: \"069f92ef-1b80-414b-a631-08b572fdd419\") " pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.937772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r5z7\" (UniqueName: \"kubernetes.io/projected/20350140-2938-4a4b-b17c-6532ba45ac98-kube-api-access-2r5z7\") pod \"authentication-operator-69f744f599-77jxw\" (UID: \"20350140-2938-4a4b-b17c-6532ba45ac98\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.959900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsx8\" (UniqueName: 
\"kubernetes.io/projected/b95bacee-8e7a-4f84-a635-5fe22d2a700e-kube-api-access-gwsx8\") pod \"openshift-config-operator-7777fb866f-zpkkw\" (UID: \"b95bacee-8e7a-4f84-a635-5fe22d2a700e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:33 crc kubenswrapper[4773]: I1012 20:26:33.989617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbdm\" (UniqueName: \"kubernetes.io/projected/b9ea34ec-5e6f-40ad-b636-437335b1025a-kube-api-access-xtbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-q26mg\" (UID: \"b9ea34ec-5e6f-40ad-b636-437335b1025a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.016156 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.016644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f85w\" (UniqueName: \"kubernetes.io/projected/bf381381-f5d3-4217-8a9c-cf527e2c6c65-kube-api-access-9f85w\") pod \"machine-api-operator-5694c8668f-nmsrw\" (UID: \"bf381381-f5d3-4217-8a9c-cf527e2c6c65\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.019217 4773 request.go:700] Waited for 1.007613369s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.021442 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.029037 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.030162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpjz\" (UniqueName: \"kubernetes.io/projected/d1f75e12-58b5-40ee-8254-70653c1b78bf-kube-api-access-twpjz\") pod \"apiserver-76f77b778f-n6tdw\" (UID: \"d1f75e12-58b5-40ee-8254-70653c1b78bf\") " pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.037024 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.040168 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.047489 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.055230 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.060220 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.068126 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.075469 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.092991 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.093114 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.093434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.099612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xblzk\" (UniqueName: \"kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk\") pod \"route-controller-manager-6576b87f9c-mpct2\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.120727 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.136870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf5gr\" (UniqueName: \"kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr\") pod \"oauth-openshift-558db77b4-7jq2n\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.151081 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.161436 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.176147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjczv\" (UniqueName: \"kubernetes.io/projected/e5e72fa2-1994-43c0-940c-bf63a893b4c9-kube-api-access-zjczv\") pod \"machine-approver-56656f9798-kx756\" (UID: \"e5e72fa2-1994-43c0-940c-bf63a893b4c9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.184386 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.200757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.222041 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.250954 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.261458 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.271911 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.284518 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.306165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.320215 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.347028 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.359195 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.359414 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.360845 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.380941 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.391336 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd934a31_cbf9_4b33_831c_2622adbe4f76.slice/crio-9c68021a06d3a8605ef7471383cebb1ed3502738b10df5f23bd68ea331a69ccf WatchSource:0}: Error finding container 9c68021a06d3a8605ef7471383cebb1ed3502738b10df5f23bd68ea331a69ccf: Status 404 returned error can't find the container with id 9c68021a06d3a8605ef7471383cebb1ed3502738b10df5f23bd68ea331a69ccf Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.400685 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.424997 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.425014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.431099 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.443911 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.446683 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.461419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.480089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.502132 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.523595 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.540629 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.540789 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cpc54"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.551623 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.568381 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 12 20:26:34 crc 
kubenswrapper[4773]: I1012 20:26:34.581914 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.599330 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8161c743_a5fd_49bc_ae70_3b249ec900dc.slice/crio-c3cc5b4bbc2364380de36f76b2435357a6d9995bacb30c9f6335ce369f426b6e WatchSource:0}: Error finding container c3cc5b4bbc2364380de36f76b2435357a6d9995bacb30c9f6335ce369f426b6e: Status 404 returned error can't find the container with id c3cc5b4bbc2364380de36f76b2435357a6d9995bacb30c9f6335ce369f426b6e Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.599925 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069f92ef_1b80_414b_a631_08b572fdd419.slice/crio-22e35f09479b10f9f9d5bc9e7480dcf439cf120543f2b71d82e3688e5efbed81 WatchSource:0}: Error finding container 22e35f09479b10f9f9d5bc9e7480dcf439cf120543f2b71d82e3688e5efbed81: Status 404 returned error can't find the container with id 22e35f09479b10f9f9d5bc9e7480dcf439cf120543f2b71d82e3688e5efbed81 Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.600820 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.621382 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.643313 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.661071 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 
20:26:34.682962 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.708379 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.718625 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.721963 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.727752 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-77jxw"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.730016 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-m6mmg"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.734204 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg"] Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.752994 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6ec546_7ecf_4ec4_b44f_eb2dd5e9a1c6.slice/crio-fb4b5b1ac42aa2dc6e54f9960e814b7ffe30ce8fabc8e8401993351be816b1ae WatchSource:0}: Error finding container fb4b5b1ac42aa2dc6e54f9960e814b7ffe30ce8fabc8e8401993351be816b1ae: Status 404 returned error can't find the container with id fb4b5b1ac42aa2dc6e54f9960e814b7ffe30ce8fabc8e8401993351be816b1ae Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.758039 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20350140_2938_4a4b_b17c_6532ba45ac98.slice/crio-0feffb18cc16c1cfa479080cc75485b36b300c7df268aaba563db8850db8aa94 WatchSource:0}: Error finding container 0feffb18cc16c1cfa479080cc75485b36b300c7df268aaba563db8850db8aa94: Status 404 returned error can't find the container with id 0feffb18cc16c1cfa479080cc75485b36b300c7df268aaba563db8850db8aa94 Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.760760 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.780898 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.800301 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.820518 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.838697 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.840488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.853423 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmsrw"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.869232 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 
20:26:34.874614 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb95bacee_8e7a_4f84_a635_5fe22d2a700e.slice/crio-6c3eda6b56c1395945e6915569826e50b56a5df90c47d60a3f8f6dbfe86f5684 WatchSource:0}: Error finding container 6c3eda6b56c1395945e6915569826e50b56a5df90c47d60a3f8f6dbfe86f5684: Status 404 returned error can't find the container with id 6c3eda6b56c1395945e6915569826e50b56a5df90c47d60a3f8f6dbfe86f5684 Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.880395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.894628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.898095 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.900571 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 20:26:34 crc kubenswrapper[4773]: W1012 20:26:34.911952 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be19f86_17ad_4697_9ff0_d5b7ee06a60d.slice/crio-c74e8c5b41f1311eb7b129f18622e527830f9cea4aae009aeb53010a5dd12311 WatchSource:0}: Error finding container c74e8c5b41f1311eb7b129f18622e527830f9cea4aae009aeb53010a5dd12311: Status 404 returned error can't find the container with id c74e8c5b41f1311eb7b129f18622e527830f9cea4aae009aeb53010a5dd12311 Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.920496 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.925503 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n6tdw"] Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.940086 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.961534 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 12 20:26:34 crc kubenswrapper[4773]: I1012 20:26:34.980652 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.000703 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.020151 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.038661 4773 request.go:700] Waited for 1.896366041s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.040264 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.061623 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.081125 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.101057 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.124646 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.160274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" event={"ID":"20350140-2938-4a4b-b17c-6532ba45ac98","Type":"ContainerStarted","Data":"6e24689478b08a13d4766bd12e0e93b9995acbe7abd369c82148a2bf3abc6edd"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.160310 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" event={"ID":"20350140-2938-4a4b-b17c-6532ba45ac98","Type":"ContainerStarted","Data":"0feffb18cc16c1cfa479080cc75485b36b300c7df268aaba563db8850db8aa94"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.161686 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" event={"ID":"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6","Type":"ContainerStarted","Data":"fb4b5b1ac42aa2dc6e54f9960e814b7ffe30ce8fabc8e8401993351be816b1ae"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.162745 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfc75" event={"ID":"ad343a90-adad-46cc-b828-93cda758fd2b","Type":"ContainerStarted","Data":"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.162763 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfc75" 
event={"ID":"ad343a90-adad-46cc-b828-93cda758fd2b","Type":"ContainerStarted","Data":"12b88f31266761f23498339f1187d21b723dae12d7caff904f91b33b04d4b6ca"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.164832 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" event={"ID":"b9ea34ec-5e6f-40ad-b636-437335b1025a","Type":"ContainerStarted","Data":"0912d0fe1c23b58fe1990fc9f357de7abdf79db343495bf543041a5835b3d612"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.164859 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" event={"ID":"b9ea34ec-5e6f-40ad-b636-437335b1025a","Type":"ContainerStarted","Data":"ba90b077df099d490ced89a8d7fb6126544550b4ed37241012131aaace5fcd29"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.171585 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" event={"ID":"8161c743-a5fd-49bc-ae70-3b249ec900dc","Type":"ContainerStarted","Data":"abb67ca3be73d39872e840fdb68045bbca4f0d1f02c79c025746f72f7591b360"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.171626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" event={"ID":"8161c743-a5fd-49bc-ae70-3b249ec900dc","Type":"ContainerStarted","Data":"c3cc5b4bbc2364380de36f76b2435357a6d9995bacb30c9f6335ce369f426b6e"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.186822 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" event={"ID":"cd934a31-cbf9-4b33-831c-2622adbe4f76","Type":"ContainerStarted","Data":"2d6bec76dd71d7441b8fc6418295b20fb8a5a7404e88aebc0bb2792e964d73d2"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.186863 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" event={"ID":"cd934a31-cbf9-4b33-831c-2622adbe4f76","Type":"ContainerStarted","Data":"9c68021a06d3a8605ef7471383cebb1ed3502738b10df5f23bd68ea331a69ccf"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.187643 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.189996 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wc5dc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.190032 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.190582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cpc54" event={"ID":"069f92ef-1b80-414b-a631-08b572fdd419","Type":"ContainerStarted","Data":"6a314dea075b9eb5cc80bca61e189fc645762715a8ce9f0a045f0303f9b08db3"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.190616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cpc54" event={"ID":"069f92ef-1b80-414b-a631-08b572fdd419","Type":"ContainerStarted","Data":"22e35f09479b10f9f9d5bc9e7480dcf439cf120543f2b71d82e3688e5efbed81"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.190925 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.192019 4773 patch_prober.go:28] interesting pod/console-operator-58897d9998-cpc54 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.192045 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cpc54" podUID="069f92ef-1b80-414b-a631-08b572fdd419" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.202797 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" event={"ID":"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba","Type":"ContainerStarted","Data":"8b185577d69414b4dbe321e31cc320dc813f3ccccf3221369b4077d680a5c627"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.202850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" event={"ID":"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba","Type":"ContainerStarted","Data":"9a5b9429e94e867be4b3bc18e5a77087bae567f7c195af5c9c0dd29f501c3969"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.202868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" event={"ID":"6fb9ac5d-e4a2-44dc-861c-9040aadcb4ba","Type":"ContainerStarted","Data":"c1af7df3ddfbcfef6b2c7ebfe2ec03c4d1f2f2478b0d2aa0199a047314b7ff0f"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.209260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/downloads-7954f5f757-m6mmg" event={"ID":"55db89f5-582b-4326-8150-4e51c83f0706","Type":"ContainerStarted","Data":"67e6cddf94eb78dc0e9d4736daa9f7429614c2d38fac9f86e6884c72f557506a"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.209296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-m6mmg" event={"ID":"55db89f5-582b-4326-8150-4e51c83f0706","Type":"ContainerStarted","Data":"ea2fe19e2b827224ceca80961df9faeebe4660ae20089bbf5c130a2b7d5f9583"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.209869 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.213353 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-m6mmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.213389 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m6mmg" podUID="55db89f5-582b-4326-8150-4e51c83f0706" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.214818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" event={"ID":"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6","Type":"ContainerStarted","Data":"a674099f124e2f2d1615f7b84efdf57225669e523aa3f3160d78501ab8bf41c5"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.217638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" 
event={"ID":"d1f75e12-58b5-40ee-8254-70653c1b78bf","Type":"ContainerStarted","Data":"3f644f065bb9f73e218731929b3a14f597d8acfb3b40e2a5ad4644d43f8a4753"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.219253 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" event={"ID":"b95bacee-8e7a-4f84-a635-5fe22d2a700e","Type":"ContainerStarted","Data":"9216aef08b20d38bfbe829ac705f97e4cce6016ccc804b4c85d9a7a6e98cf510"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.219278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" event={"ID":"b95bacee-8e7a-4f84-a635-5fe22d2a700e","Type":"ContainerStarted","Data":"6c3eda6b56c1395945e6915569826e50b56a5df90c47d60a3f8f6dbfe86f5684"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.222875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" event={"ID":"7be19f86-17ad-4697-9ff0-d5b7ee06a60d","Type":"ContainerStarted","Data":"c74e8c5b41f1311eb7b129f18622e527830f9cea4aae009aeb53010a5dd12311"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.226686 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" event={"ID":"bf381381-f5d3-4217-8a9c-cf527e2c6c65","Type":"ContainerStarted","Data":"16b8e135bc463ce453c2d98ac8b95dc8361b1515c28b4778ce513e93ee752385"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.226743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" event={"ID":"bf381381-f5d3-4217-8a9c-cf527e2c6c65","Type":"ContainerStarted","Data":"5134bbf875ab840e47b5a7cd780f964e3676906940dc9ab7670685d05eb2c038"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.233119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" event={"ID":"e5e72fa2-1994-43c0-940c-bf63a893b4c9","Type":"ContainerStarted","Data":"8acddb8f2c33cb6a77e38d8be586371622edc5107f50a875758b6c0dfd121f26"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.233158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" event={"ID":"e5e72fa2-1994-43c0-940c-bf63a893b4c9","Type":"ContainerStarted","Data":"ebf886f5a34e4ed4007008543b6f2a4bff794232167984bcdd0474eb36fad013"} Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.236689 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.236764 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.236792 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f82c413-ade9-48f4-9f80-ca87f65b08e5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 
20:26:35.236809 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfe32921-9171-482d-ba52-b042ec1620c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.237691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.237994 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.238439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.238926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8f82c413-ade9-48f4-9f80-ca87f65b08e5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.238988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pn28\" (UniqueName: \"kubernetes.io/projected/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-kube-api-access-7pn28\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.239110 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znqgm\" (UniqueName: \"kubernetes.io/projected/11889744-920e-4aec-b094-438235439ac5-kube-api-access-znqgm\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.239657 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.240267 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:35.740252249 +0000 UTC m=+143.976550809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240316 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240350 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-metrics-certs\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240366 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240380 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-service-ca\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240410 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-config\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-default-certificate\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-serving-cert\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240511 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8spq\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240536 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35b51e7-c8b0-449a-8b24-51229012cc51-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240555 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-stats-auth\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240589 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-ca\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwp4r\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-kube-api-access-wwp4r\") pod 
\"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgcnc\" (UniqueName: \"kubernetes.io/projected/c35b51e7-c8b0-449a-8b24-51229012cc51-kube-api-access-dgcnc\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240659 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-config\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240701 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11889744-920e-4aec-b094-438235439ac5-service-ca-bundle\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe32921-9171-482d-ba52-b042ec1620c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc 
kubenswrapper[4773]: I1012 20:26:35.240781 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-client\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f82c413-ade9-48f4-9f80-ca87f65b08e5-config\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240829 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b51e7-c8b0-449a-8b24-51229012cc51-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.240871 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.346286 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.346988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8spq\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347031 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m488q\" (UniqueName: \"kubernetes.io/projected/2368666b-711a-4896-8027-ed3a52adc56f-kube-api-access-m488q\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347142 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-stats-auth\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347213 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwp4r\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-kube-api-access-wwp4r\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgcnc\" (UniqueName: \"kubernetes.io/projected/c35b51e7-c8b0-449a-8b24-51229012cc51-kube-api-access-dgcnc\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347285 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49wx\" (UniqueName: \"kubernetes.io/projected/0ff2ee20-e160-48f5-b794-8a1d38750b59-kube-api-access-f49wx\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347316 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-srv-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347347 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11889744-920e-4aec-b094-438235439ac5-service-ca-bundle\") pod 
\"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe32921-9171-482d-ba52-b042ec1620c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f82c413-ade9-48f4-9f80-ca87f65b08e5-config\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b51e7-c8b0-449a-8b24-51229012cc51-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347594 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-webhook-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xv88\" (UniqueName: \"kubernetes.io/projected/49173688-9a35-41db-a392-97d250a3780a-kube-api-access-8xv88\") pod 
\"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347804 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5l5n\" (UniqueName: \"kubernetes.io/projected/0c398d65-6acf-4db2-9a00-e712405e7a1e-kube-api-access-h5l5n\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c398d65-6acf-4db2-9a00-e712405e7a1e-metrics-tls\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c398d65-6acf-4db2-9a00-e712405e7a1e-config-volume\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347947 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49173688-9a35-41db-a392-97d250a3780a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347970 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfe32921-9171-482d-ba52-b042ec1620c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.347993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f82c413-ade9-48f4-9f80-ca87f65b08e5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348027 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-registration-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 
20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2368666b-711a-4896-8027-ed3a52adc56f-serving-cert\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348139 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-plugins-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348206 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5d1cf94-045f-4927-8955-88a732596ec9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348229 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhqs\" (UniqueName: 
\"kubernetes.io/projected/f4a668a7-8687-4d7c-ab9c-fc56447681d7-kube-api-access-2vhqs\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znqgm\" (UniqueName: \"kubernetes.io/projected/11889744-920e-4aec-b094-438235439ac5-kube-api-access-znqgm\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pn28\" (UniqueName: \"kubernetes.io/projected/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-kube-api-access-7pn28\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348310 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgv94\" (UniqueName: \"kubernetes.io/projected/81a2a130-fa0c-4fe9-b311-3698669ba724-kube-api-access-tgv94\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348344 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348381 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmhm\" (UniqueName: \"kubernetes.io/projected/e5d1cf94-045f-4927-8955-88a732596ec9-kube-api-access-jzmhm\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-default-certificate\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-serving-cert\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: 
\"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-key\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-images\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-cabundle\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348616 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gst\" (UniqueName: \"kubernetes.io/projected/0f59df9e-60ab-48cf-b890-f5eb8e05557e-kube-api-access-94gst\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348689 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35b51e7-c8b0-449a-8b24-51229012cc51-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348724 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59p4\" (UniqueName: \"kubernetes.io/projected/f07f1630-495c-4a1b-9852-5dbf1887f672-kube-api-access-c59p4\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcf5r\" (UniqueName: \"kubernetes.io/projected/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-kube-api-access-qcf5r\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348770 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qccps\" (UniqueName: \"kubernetes.io/projected/c36cd04d-971f-4a77-95cc-5e5493c2272f-kube-api-access-qccps\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348792 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-node-bootstrap-token\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-ca\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348869 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7958da-ae4a-4fc3-b703-43b878f090f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348889 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-cert\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-certs\") pod 
\"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.348990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-config\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-srv-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349058 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-client\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349075 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349096 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349160 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-proxy-tls\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349212 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36cd04d-971f-4a77-95cc-5e5493c2272f-metrics-tls\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnt2\" (UniqueName: \"kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2\") pod 
\"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-proxy-tls\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349277 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-csi-data-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-apiservice-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349416 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-mountpoint-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-socket-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsw7q\" (UniqueName: \"kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2368666b-711a-4896-8027-ed3a52adc56f-config\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddd7\" (UniqueName: \"kubernetes.io/projected/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-kube-api-access-dddd7\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349611 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-profile-collector-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f82c413-ade9-48f4-9f80-ca87f65b08e5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349691 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zl2\" (UniqueName: \"kubernetes.io/projected/50e10a60-bf33-4d27-945e-49a573c2ccbc-kube-api-access-92zl2\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349743 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksvx\" (UniqueName: \"kubernetes.io/projected/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-kube-api-access-2ksvx\") pod \"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d1cf94-045f-4927-8955-88a732596ec9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-metrics-certs\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349827 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-service-ca\") pod 
\"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb9vk\" (UniqueName: \"kubernetes.io/projected/5c7958da-ae4a-4fc3-b703-43b878f090f1-kube-api-access-cb9vk\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-config\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349939 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f07f1630-495c-4a1b-9852-5dbf1887f672-tmpfs\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.349975 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pg7\" (UniqueName: \"kubernetes.io/projected/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-kube-api-access-d5pg7\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.350015 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.350037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntgf8\" (UniqueName: \"kubernetes.io/projected/1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0-kube-api-access-ntgf8\") pod \"migrator-59844c95c7-cl9q6\" (UID: \"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.350174 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:35.850153654 +0000 UTC m=+144.086452214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.357582 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.357580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11889744-920e-4aec-b094-438235439ac5-service-ca-bundle\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.360395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.362562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: 
\"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.364937 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-client\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.365267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35b51e7-c8b0-449a-8b24-51229012cc51-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.365426 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-config\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.366325 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-service-ca\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.366613 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-default-certificate\") pod 
\"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.367530 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-config\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.368699 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-metrics-certs\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.368771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f82c413-ade9-48f4-9f80-ca87f65b08e5-config\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.369124 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.371401 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cfe32921-9171-482d-ba52-b042ec1620c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.371427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-etcd-ca\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.371465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/11889744-920e-4aec-b094-438235439ac5-stats-auth\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.377745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8spq\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.378696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cfe32921-9171-482d-ba52-b042ec1620c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.379205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-serving-cert\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.379245 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.380092 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c35b51e7-c8b0-449a-8b24-51229012cc51-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.380536 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.382079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f82c413-ade9-48f4-9f80-ca87f65b08e5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: \"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.416533 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwp4r\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-kube-api-access-wwp4r\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.434942 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgcnc\" (UniqueName: \"kubernetes.io/projected/c35b51e7-c8b0-449a-8b24-51229012cc51-kube-api-access-dgcnc\") pod \"openshift-controller-manager-operator-756b6f6bc6-l7s6c\" (UID: \"c35b51e7-c8b0-449a-8b24-51229012cc51\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.451532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.451781 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.451885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.451967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-webhook-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452058 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xv88\" (UniqueName: \"kubernetes.io/projected/49173688-9a35-41db-a392-97d250a3780a-kube-api-access-8xv88\") pod \"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5l5n\" (UniqueName: \"kubernetes.io/projected/0c398d65-6acf-4db2-9a00-e712405e7a1e-kube-api-access-h5l5n\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c398d65-6acf-4db2-9a00-e712405e7a1e-metrics-tls\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0c398d65-6acf-4db2-9a00-e712405e7a1e-config-volume\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49173688-9a35-41db-a392-97d250a3780a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.453421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-registration-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.453536 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2368666b-711a-4896-8027-ed3a52adc56f-serving-cert\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.453968 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-plugins-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454063 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5d1cf94-045f-4927-8955-88a732596ec9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454136 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhqs\" (UniqueName: \"kubernetes.io/projected/f4a668a7-8687-4d7c-ab9c-fc56447681d7-kube-api-access-2vhqs\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454217 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgv94\" (UniqueName: \"kubernetes.io/projected/81a2a130-fa0c-4fe9-b311-3698669ba724-kube-api-access-tgv94\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454291 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmhm\" (UniqueName: \"kubernetes.io/projected/e5d1cf94-045f-4927-8955-88a732596ec9-kube-api-access-jzmhm\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-key\") pod \"service-ca-9c57cc56f-74w74\" (UID: 
\"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-images\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.454078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-plugins-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.452482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.453702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-registration-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.453031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c398d65-6acf-4db2-9a00-e712405e7a1e-config-volume\") pod \"dns-default-jbzcl\" (UID: 
\"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5d1cf94-045f-4927-8955-88a732596ec9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-images\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455566 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-cabundle\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gst\" (UniqueName: \"kubernetes.io/projected/0f59df9e-60ab-48cf-b890-f5eb8e05557e-kube-api-access-94gst\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59p4\" (UniqueName: \"kubernetes.io/projected/f07f1630-495c-4a1b-9852-5dbf1887f672-kube-api-access-c59p4\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455825 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcf5r\" (UniqueName: \"kubernetes.io/projected/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-kube-api-access-qcf5r\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455843 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qccps\" (UniqueName: \"kubernetes.io/projected/c36cd04d-971f-4a77-95cc-5e5493c2272f-kube-api-access-qccps\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-node-bootstrap-token\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455877 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7958da-ae4a-4fc3-b703-43b878f090f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455895 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-cert\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455913 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-certs\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-srv-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc 
kubenswrapper[4773]: I1012 20:26:35.455965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.455982 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456009 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-proxy-tls\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456029 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36cd04d-971f-4a77-95cc-5e5493c2272f-metrics-tls\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnt2\" (UniqueName: \"kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2\") pod 
\"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-proxy-tls\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456090 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-csi-data-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456094 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-apiservice-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-mountpoint-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-socket-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456399 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw7q\" (UniqueName: \"kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2368666b-711a-4896-8027-ed3a52adc56f-config\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456450 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddd7\" (UniqueName: \"kubernetes.io/projected/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-kube-api-access-dddd7\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456468 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-profile-collector-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456506 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zl2\" (UniqueName: \"kubernetes.io/projected/50e10a60-bf33-4d27-945e-49a573c2ccbc-kube-api-access-92zl2\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456527 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksvx\" (UniqueName: \"kubernetes.io/projected/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-kube-api-access-2ksvx\") pod \"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d1cf94-045f-4927-8955-88a732596ec9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: 
\"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456579 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c398d65-6acf-4db2-9a00-e712405e7a1e-metrics-tls\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb9vk\" (UniqueName: \"kubernetes.io/projected/5c7958da-ae4a-4fc3-b703-43b878f090f1-kube-api-access-cb9vk\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f07f1630-495c-4a1b-9852-5dbf1887f672-tmpfs\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456654 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pg7\" (UniqueName: \"kubernetes.io/projected/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-kube-api-access-d5pg7\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456671 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntgf8\" (UniqueName: \"kubernetes.io/projected/1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0-kube-api-access-ntgf8\") pod \"migrator-59844c95c7-cl9q6\" (UID: \"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456736 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m488q\" (UniqueName: \"kubernetes.io/projected/2368666b-711a-4896-8027-ed3a52adc56f-kube-api-access-m488q\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49wx\" (UniqueName: \"kubernetes.io/projected/0ff2ee20-e160-48f5-b794-8a1d38750b59-kube-api-access-f49wx\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456794 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-srv-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.456941 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-key\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.461275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.461703 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49173688-9a35-41db-a392-97d250a3780a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.462104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-profile-collector-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.462467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.462708 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-webhook-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.463027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-mountpoint-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.463094 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-socket-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.463227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-profile-collector-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.463317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7958da-ae4a-4fc3-b703-43b878f090f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.463318 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-certs\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " 
pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.463708 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:35.96369153 +0000 UTC m=+144.199990090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.464082 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2368666b-711a-4896-8027-ed3a52adc56f-config\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.462773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.465037 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/50e10a60-bf33-4d27-945e-49a573c2ccbc-node-bootstrap-token\") pod 
\"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.465401 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f07f1630-495c-4a1b-9852-5dbf1887f672-tmpfs\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.465568 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f4a668a7-8687-4d7c-ab9c-fc56447681d7-csi-data-dir\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.466036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.466565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.467102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.467244 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/81a2a130-fa0c-4fe9-b311-3698669ba724-srv-cert\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.467483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0f59df9e-60ab-48cf-b890-f5eb8e05557e-signing-cabundle\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.468101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-cert\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.469260 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ff2ee20-e160-48f5-b794-8a1d38750b59-srv-cert\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.471038 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2368666b-711a-4896-8027-ed3a52adc56f-serving-cert\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.472045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f07f1630-495c-4a1b-9852-5dbf1887f672-apiservice-cert\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.472362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20b7f5f4-5661-49f4-a045-d2c6cfc24ebf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h54fr\" (UID: \"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.472519 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.474472 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-proxy-tls\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc 
kubenswrapper[4773]: I1012 20:26:35.474909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c36cd04d-971f-4a77-95cc-5e5493c2272f-metrics-tls\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.474922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-proxy-tls\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.474962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d1cf94-045f-4927-8955-88a732596ec9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.487263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfe32921-9171-482d-ba52-b042ec1620c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jp7mp\" (UID: \"cfe32921-9171-482d-ba52-b042ec1620c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.499708 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f82c413-ade9-48f4-9f80-ca87f65b08e5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwfzb\" (UID: 
\"8f82c413-ade9-48f4-9f80-ca87f65b08e5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.517990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pn28\" (UniqueName: \"kubernetes.io/projected/ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6-kube-api-access-7pn28\") pod \"etcd-operator-b45778765-55vqv\" (UID: \"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.538693 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znqgm\" (UniqueName: \"kubernetes.io/projected/11889744-920e-4aec-b094-438235439ac5-kube-api-access-znqgm\") pod \"router-default-5444994796-hrdqp\" (UID: \"11889744-920e-4aec-b094-438235439ac5\") " pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.553023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.558230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.558334 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.058303747 +0000 UTC m=+144.294602307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.558668 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.559004 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.058988566 +0000 UTC m=+144.295287126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.598766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xv88\" (UniqueName: \"kubernetes.io/projected/49173688-9a35-41db-a392-97d250a3780a-kube-api-access-8xv88\") pod \"package-server-manager-789f6589d5-k7zb4\" (UID: \"49173688-9a35-41db-a392-97d250a3780a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.600496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.608851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.620341 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.626192 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5l5n\" (UniqueName: \"kubernetes.io/projected/0c398d65-6acf-4db2-9a00-e712405e7a1e-kube-api-access-h5l5n\") pod \"dns-default-jbzcl\" (UID: \"0c398d65-6acf-4db2-9a00-e712405e7a1e\") " pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.635688 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhqs\" (UniqueName: \"kubernetes.io/projected/f4a668a7-8687-4d7c-ab9c-fc56447681d7-kube-api-access-2vhqs\") pod \"csi-hostpathplugin-hxq5p\" (UID: \"f4a668a7-8687-4d7c-ab9c-fc56447681d7\") " pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.637609 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:35 crc kubenswrapper[4773]: W1012 20:26:35.653957 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11889744_920e_4aec_b094_438235439ac5.slice/crio-78747b7ec34fa8d07b771b2b9e97019b443cacf3a4a38bf2333bc5a4a38ecd7e WatchSource:0}: Error finding container 78747b7ec34fa8d07b771b2b9e97019b443cacf3a4a38bf2333bc5a4a38ecd7e: Status 404 returned error can't find the container with id 78747b7ec34fa8d07b771b2b9e97019b443cacf3a4a38bf2333bc5a4a38ecd7e Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.654302 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.659302 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.659460 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.159423095 +0000 UTC m=+144.395721655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.659759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.659933 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.660341 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.160329001 +0000 UTC m=+144.396627561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.665077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgv94\" (UniqueName: \"kubernetes.io/projected/81a2a130-fa0c-4fe9-b311-3698669ba724-kube-api-access-tgv94\") pod \"olm-operator-6b444d44fb-j2cps\" (UID: \"81a2a130-fa0c-4fe9-b311-3698669ba724\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.674117 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmhm\" (UniqueName: \"kubernetes.io/projected/e5d1cf94-045f-4927-8955-88a732596ec9-kube-api-access-jzmhm\") pod \"kube-storage-version-migrator-operator-b67b599dd-rmd4d\" (UID: \"e5d1cf94-045f-4927-8955-88a732596ec9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.697351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-94gst\" (UniqueName: \"kubernetes.io/projected/0f59df9e-60ab-48cf-b890-f5eb8e05557e-kube-api-access-94gst\") pod \"service-ca-9c57cc56f-74w74\" (UID: \"0f59df9e-60ab-48cf-b890-f5eb8e05557e\") " pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.706743 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.714804 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.722577 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcf5r\" (UniqueName: \"kubernetes.io/projected/8fbf52ce-1e8f-4e75-9abd-b4660de1b941-kube-api-access-qcf5r\") pod \"ingress-canary-zdjkd\" (UID: \"8fbf52ce-1e8f-4e75-9abd-b4660de1b941\") " pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.739527 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.739692 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-74w74" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.745815 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59p4\" (UniqueName: \"kubernetes.io/projected/f07f1630-495c-4a1b-9852-5dbf1887f672-kube-api-access-c59p4\") pod \"packageserver-d55dfcdfc-hgxst\" (UID: \"f07f1630-495c-4a1b-9852-5dbf1887f672\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.757447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qccps\" (UniqueName: \"kubernetes.io/projected/c36cd04d-971f-4a77-95cc-5e5493c2272f-kube-api-access-qccps\") pod \"dns-operator-744455d44c-gp2tr\" (UID: \"c36cd04d-971f-4a77-95cc-5e5493c2272f\") " pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.762191 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.762564 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.26253612 +0000 UTC m=+144.498834680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.763031 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.781372 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.784385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zl2\" (UniqueName: \"kubernetes.io/projected/50e10a60-bf33-4d27-945e-49a573c2ccbc-kube-api-access-92zl2\") pod \"machine-config-server-84fdf\" (UID: \"50e10a60-bf33-4d27-945e-49a573c2ccbc\") " pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.812552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zdjkd" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.813065 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.816521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksvx\" (UniqueName: \"kubernetes.io/projected/fa851b59-ffb3-46c4-a61e-31f85d43eb7a-kube-api-access-2ksvx\") pod \"control-plane-machine-set-operator-78cbb6b69f-89dsq\" (UID: \"fa851b59-ffb3-46c4-a61e-31f85d43eb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.820569 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-84fdf" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.836018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dsgf4\" (UID: \"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.838855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw7q\" (UniqueName: \"kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q\") pod \"marketplace-operator-79b997595-nx5ql\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.855164 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m488q\" (UniqueName: \"kubernetes.io/projected/2368666b-711a-4896-8027-ed3a52adc56f-kube-api-access-m488q\") pod \"service-ca-operator-777779d784-86ftc\" (UID: \"2368666b-711a-4896-8027-ed3a52adc56f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.863647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.863970 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.363958287 +0000 UTC m=+144.600256837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.888479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pg7\" (UniqueName: \"kubernetes.io/projected/661f7ebe-f3ce-488c-81ea-a99f55b38c2b-kube-api-access-d5pg7\") pod \"machine-config-operator-74547568cd-pbtbs\" (UID: \"661f7ebe-f3ce-488c-81ea-a99f55b38c2b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.895516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb9vk\" (UniqueName: 
\"kubernetes.io/projected/5c7958da-ae4a-4fc3-b703-43b878f090f1-kube-api-access-cb9vk\") pod \"multus-admission-controller-857f4d67dd-lgcxv\" (UID: \"5c7958da-ae4a-4fc3-b703-43b878f090f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.897923 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb"] Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.917768 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntgf8\" (UniqueName: \"kubernetes.io/projected/1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0-kube-api-access-ntgf8\") pod \"migrator-59844c95c7-cl9q6\" (UID: \"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.941998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddd7\" (UniqueName: \"kubernetes.io/projected/430f4c9b-ec64-4f26-a3b1-d50ed21cdaad-kube-api-access-dddd7\") pod \"machine-config-controller-84d6567774-vpzth\" (UID: \"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.954311 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c"] Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.957873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49wx\" (UniqueName: \"kubernetes.io/projected/0ff2ee20-e160-48f5-b794-8a1d38750b59-kube-api-access-f49wx\") pod \"catalog-operator-68c6474976-kk4lm\" (UID: \"0ff2ee20-e160-48f5-b794-8a1d38750b59\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:35 crc 
kubenswrapper[4773]: I1012 20:26:35.967095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:35 crc kubenswrapper[4773]: E1012 20:26:35.967490 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.467474573 +0000 UTC m=+144.703773133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.967939 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.975101 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.975974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnt2\" (UniqueName: \"kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2\") pod \"collect-profiles-29338335-9wxc8\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.982317 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" Oct 12 20:26:35 crc kubenswrapper[4773]: I1012 20:26:35.991426 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.011167 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.019027 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.029974 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.058514 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.062982 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.063509 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.070892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.071300 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.571288178 +0000 UTC m=+144.807586738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.072912 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.121984 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.171625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.172401 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.672381756 +0000 UTC m=+144.908680316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.223301 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-55vqv"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.233867 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.259164 4773 generic.go:334] "Generic (PLEG): container finished" podID="b95bacee-8e7a-4f84-a635-5fe22d2a700e" containerID="9216aef08b20d38bfbe829ac705f97e4cce6016ccc804b4c85d9a7a6e98cf510" exitCode=0 Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.259338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" event={"ID":"b95bacee-8e7a-4f84-a635-5fe22d2a700e","Type":"ContainerDied","Data":"9216aef08b20d38bfbe829ac705f97e4cce6016ccc804b4c85d9a7a6e98cf510"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.259451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" event={"ID":"b95bacee-8e7a-4f84-a635-5fe22d2a700e","Type":"ContainerStarted","Data":"8c942ed3a33268ea928173f9eafc1eba06686b1edcf1d49deed2258568f42dfc"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.259633 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" Oct 12 20:26:36 crc 
kubenswrapper[4773]: I1012 20:26:36.276100 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.276467 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.776455037 +0000 UTC m=+145.012753597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.313421 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" event={"ID":"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6","Type":"ContainerStarted","Data":"dec783533cfe12a04aae6e622f314f2288413256bca182436f8edf10b5f63246"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.313632 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.369012 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1f75e12-58b5-40ee-8254-70653c1b78bf" 
containerID="f5b57f64a9b24033da28a52d773e35b7881df0b2d6fff415417df463f3bb74b2" exitCode=0 Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.374738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" event={"ID":"d1f75e12-58b5-40ee-8254-70653c1b78bf","Type":"ContainerDied","Data":"f5b57f64a9b24033da28a52d773e35b7881df0b2d6fff415417df463f3bb74b2"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.379897 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.383752 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.883692697 +0000 UTC m=+145.119991257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.392811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-84fdf" event={"ID":"50e10a60-bf33-4d27-945e-49a573c2ccbc","Type":"ContainerStarted","Data":"40cf244be8891223741f2f47dabb9c94c182e5349209f320a65fbbcc05ce3efc"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.456602 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hrdqp" event={"ID":"11889744-920e-4aec-b094-438235439ac5","Type":"ContainerStarted","Data":"018d3c30b364f2644ff050e9b4bdd60fe81c3e9d8bb5c4388bcdb2017fff958f"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.456660 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hrdqp" event={"ID":"11889744-920e-4aec-b094-438235439ac5","Type":"ContainerStarted","Data":"78747b7ec34fa8d07b771b2b9e97019b443cacf3a4a38bf2333bc5a4a38ecd7e"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.493106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.494257 4773 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:36.99424218 +0000 UTC m=+145.230540740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.526350 4773 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7jq2n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.526396 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.545610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" event={"ID":"7be19f86-17ad-4697-9ff0-d5b7ee06a60d","Type":"ContainerStarted","Data":"a8b2f60fa9881667f103897d81382e26064f28a4908e88f25955fcabdc0aeb3a"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.545988 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:26:36 
crc kubenswrapper[4773]: I1012 20:26:36.546006 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.563534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" event={"ID":"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf","Type":"ContainerStarted","Data":"44635b22d06b7c44d7f6ffcf04b7c092b44517b9ad2b1fad2ae734b3953655dc"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.571036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" event={"ID":"8f82c413-ade9-48f4-9f80-ca87f65b08e5","Type":"ContainerStarted","Data":"19096cf49ea541467bf5ddda7177b00f6fe03869e5d40188959c6591975e16dc"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.593735 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.594915 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.094900126 +0000 UTC m=+145.331198686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.598206 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6" containerID="d3c70878dbe130555ac01e4fe5212d53286620afce6c4a04b49735cd402aa7ef" exitCode=0 Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.598342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" event={"ID":"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6","Type":"ContainerDied","Data":"d3c70878dbe130555ac01e4fe5212d53286620afce6c4a04b49735cd402aa7ef"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.650003 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.653753 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.653787 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.659280 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" event={"ID":"e5e72fa2-1994-43c0-940c-bf63a893b4c9","Type":"ContainerStarted","Data":"5304663f32b1330c98b2f4417f5f59dea2c70c6d0630827d84c8ca417967dbf3"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.669147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" event={"ID":"bf381381-f5d3-4217-8a9c-cf527e2c6c65","Type":"ContainerStarted","Data":"4a053b70c5b4233190842a3ce3e31ae78577ccaec61d639b61404bf15193d07c"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.672151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" event={"ID":"c35b51e7-c8b0-449a-8b24-51229012cc51","Type":"ContainerStarted","Data":"c4d58c4fe3569c0b204a2c5c1348a50f4d5becd4b7ced07eeb75ec7381fbc3a2"} Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.679551 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-m6mmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.679585 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m6mmg" podUID="55db89f5-582b-4326-8150-4e51c83f0706" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.696974 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.709610 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.724730 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.224701027 +0000 UTC m=+145.460999587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.761672 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxq5p"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.825783 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.826067 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.326039462 +0000 UTC m=+145.562338022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.826277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.826532 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.326519386 +0000 UTC m=+145.562817946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.858081 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cpc54" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.877783 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zdjkd"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.886442 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.891066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.896501 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jbzcl"] Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.931697 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.964946 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d"] Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.989415 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.489395742 +0000 UTC m=+145.725694302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:36 crc kubenswrapper[4773]: I1012 20:26:36.989881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:36 crc kubenswrapper[4773]: E1012 20:26:36.990129 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.490121452 +0000 UTC m=+145.726420002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.093153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.093670 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.593654359 +0000 UTC m=+145.829952919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.197397 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.197697 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.697685789 +0000 UTC m=+145.933984349 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.311518 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.312035 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.812019728 +0000 UTC m=+146.048318288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.418214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.418523 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:37.918511717 +0000 UTC m=+146.154810277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.460967 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.461028 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.476022 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.518707 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.519092 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.0190785 +0000 UTC m=+146.255377060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.519113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gp2tr"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.620628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.621103 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.121093264 +0000 UTC m=+146.357391824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.666623 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 12 20:26:37 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld
Oct 12 20:26:37 crc kubenswrapper[4773]: [+]process-running ok
Oct 12 20:26:37 crc kubenswrapper[4773]: healthz check failed
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.666677 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.723349 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.723647 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.223621312 +0000 UTC m=+146.459919862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.729995 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-m6mmg" podStartSLOduration=125.72997611 podStartE2EDuration="2m5.72997611s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:37.702160942 +0000 UTC m=+145.938459502" watchObservedRunningTime="2025-10-12 20:26:37.72997611 +0000 UTC m=+145.966274670"
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.730544 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.740076 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" event={"ID":"81a2a130-fa0c-4fe9-b311-3698669ba724","Type":"ContainerStarted","Data":"60ea1de89be5a79f04d2f7b4af7b5c22bb6773e8e4ad501823384c8bf949625a"}
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.780810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" event={"ID":"cfe32921-9171-482d-ba52-b042ec1620c9","Type":"ContainerStarted","Data":"da95716eb83b66bd0112cc255942952349c75af287540085ab191034adb6bdee"}
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.781620 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" podStartSLOduration=124.781611065 podStartE2EDuration="2m4.781611065s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:37.780516484 +0000 UTC m=+146.016815044" watchObservedRunningTime="2025-10-12 20:26:37.781611065 +0000 UTC m=+146.017909625"
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.801208 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fvdbd" podStartSLOduration=125.801186072 podStartE2EDuration="2m5.801186072s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:37.739836976 +0000 UTC m=+145.976135536" watchObservedRunningTime="2025-10-12 20:26:37.801186072 +0000 UTC m=+146.037484632"
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.828993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.829391 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.329378431 +0000 UTC m=+146.565676991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.841402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" event={"ID":"e5d1cf94-045f-4927-8955-88a732596ec9","Type":"ContainerStarted","Data":"c4094ddd3e8280cb3b746254f4aa6bdc04494271a7043561a0ab34575080477b"}
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.874070 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cpc54" podStartSLOduration=125.874056041 podStartE2EDuration="2m5.874056041s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:37.869135213 +0000 UTC m=+146.105433783" watchObservedRunningTime="2025-10-12 20:26:37.874056041 +0000 UTC m=+146.110354601"
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.876352 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-74w74"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.936146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:37 crc kubenswrapper[4773]: E1012 20:26:37.936513 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.436496158 +0000 UTC m=+146.672794718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.944236 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lgcxv"]
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.959459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-84fdf" event={"ID":"50e10a60-bf33-4d27-945e-49a573c2ccbc","Type":"ContainerStarted","Data":"f787a68fe82ab058d008d0309a0ac649015b0d0b0bb879ed5c5cfeb13dbcaec8"}
Oct 12 20:26:37 crc kubenswrapper[4773]: I1012 20:26:37.971943 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kx756" podStartSLOduration=125.971920309 podStartE2EDuration="2m5.971920309s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:37.958240576 +0000 UTC m=+146.194539136" watchObservedRunningTime="2025-10-12 20:26:37.971920309 +0000 UTC m=+146.208218869"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.037597 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.037961 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.537948245 +0000 UTC m=+146.774246805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.058071 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.070028 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw" podStartSLOduration=126.070004802 podStartE2EDuration="2m6.070004802s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.046628468 +0000 UTC m=+146.282927028" watchObservedRunningTime="2025-10-12 20:26:38.070004802 +0000 UTC m=+146.306303362"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.094651 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hrdqp" podStartSLOduration=125.094637361 podStartE2EDuration="2m5.094637361s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.092240134 +0000 UTC m=+146.328538694" watchObservedRunningTime="2025-10-12 20:26:38.094637361 +0000 UTC m=+146.330935921"
Oct 12 20:26:38 crc kubenswrapper[4773]: W1012 20:26:38.107581 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7958da_ae4a_4fc3_b703_43b878f090f1.slice/crio-4615ce9cf6871e0b2667642b58b5ce7d518d6f78c3e331e7362d3f82291fc575 WatchSource:0}: Error finding container 4615ce9cf6871e0b2667642b58b5ce7d518d6f78c3e331e7362d3f82291fc575: Status 404 returned error can't find the container with id 4615ce9cf6871e0b2667642b58b5ce7d518d6f78c3e331e7362d3f82291fc575
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.145832 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.146099 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.64608327 +0000 UTC m=+146.882381830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.166322 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" event={"ID":"c35b51e7-c8b0-449a-8b24-51229012cc51","Type":"ContainerStarted","Data":"44878fce34c65f4ec61b98707a18593329b704ed72a9ee3ccb6e18599389d4f8"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.168629 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q26mg" podStartSLOduration=125.1686162 podStartE2EDuration="2m5.1686162s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.1682602 +0000 UTC m=+146.404558760" watchObservedRunningTime="2025-10-12 20:26:38.1686162 +0000 UTC m=+146.404914750"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.208343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" event={"ID":"f4a668a7-8687-4d7c-ab9c-fc56447681d7","Type":"ContainerStarted","Data":"ec02283428180199e99499333e4571b2d3207fd81ccc17c860eb54c1e3970738"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.227681 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-77jxw" podStartSLOduration=126.227647072 podStartE2EDuration="2m6.227647072s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.22579578 +0000 UTC m=+146.462094340" watchObservedRunningTime="2025-10-12 20:26:38.227647072 +0000 UTC m=+146.463945632"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.247236 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.248231 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.748219797 +0000 UTC m=+146.984518357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.252639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" event={"ID":"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6","Type":"ContainerStarted","Data":"f9358b59e75220e05b7de0482783a9e7b99cc33a8fb38734e87fd331a3239589"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.283390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zdjkd" event={"ID":"8fbf52ce-1e8f-4e75-9abd-b4660de1b941","Type":"ContainerStarted","Data":"17c190b36295806d18f7a47732b9cbf22e26406b549244579bbbdbb3a19205c2"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.311458 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbzcl" event={"ID":"0c398d65-6acf-4db2-9a00-e712405e7a1e","Type":"ContainerStarted","Data":"d32968cc070208d24f68af478461ef29b3bd451c60d230c887e50b6bfe310ccf"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.323422 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86ftc"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.347834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.348351 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmsrw" podStartSLOduration=125.348329798 podStartE2EDuration="2m5.348329798s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.344332196 +0000 UTC m=+146.580630756" watchObservedRunningTime="2025-10-12 20:26:38.348329798 +0000 UTC m=+146.584628358"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.348758 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.84873957 +0000 UTC m=+147.085038130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.348772 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.361747 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" event={"ID":"49173688-9a35-41db-a392-97d250a3780a","Type":"ContainerStarted","Data":"2841650ea441d2a8315b766685339606c3b06b15d776c6d653eef47f22e5dabf"}
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.362552 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-m6mmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.362588 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m6mmg" podUID="55db89f5-582b-4326-8150-4e51c83f0706" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.455197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.457479 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:38.957464081 +0000 UTC m=+147.193762631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.482104 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" podStartSLOduration=125.48208202 podStartE2EDuration="2m5.48208202s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.481060141 +0000 UTC m=+146.717358691" watchObservedRunningTime="2025-10-12 20:26:38.48208202 +0000 UTC m=+146.718380580"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.526154 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.526528 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zpkkw"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.527528 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.544368 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.556396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.557376 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.057360516 +0000 UTC m=+147.293659076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.578861 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" podStartSLOduration=126.578843707 podStartE2EDuration="2m6.578843707s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.559443064 +0000 UTC m=+146.795741614" watchObservedRunningTime="2025-10-12 20:26:38.578843707 +0000 UTC m=+146.815142267"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.630578 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gfc75" podStartSLOduration=126.630564734 podStartE2EDuration="2m6.630564734s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.628154266 +0000 UTC m=+146.864452826" watchObservedRunningTime="2025-10-12 20:26:38.630564734 +0000 UTC m=+146.866863294"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.631868 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm"]
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.650232 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 12 20:26:38 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld
Oct 12 20:26:38 crc kubenswrapper[4773]: [+]process-running ok
Oct 12 20:26:38 crc kubenswrapper[4773]: healthz check failed
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.650292 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.660109 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.660486 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.160474 +0000 UTC m=+147.396772560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.733308 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-snnz9" podStartSLOduration=126.733290508 podStartE2EDuration="2m6.733290508s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.726826527 +0000 UTC m=+146.963125087" watchObservedRunningTime="2025-10-12 20:26:38.733290508 +0000 UTC m=+146.969589068"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.761235 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.761619 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.26160371 +0000 UTC m=+147.497902270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.809240 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-84fdf" podStartSLOduration=5.8092200720000005 podStartE2EDuration="5.809220072s" podCreationTimestamp="2025-10-12 20:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.807112023 +0000 UTC m=+147.043410583" watchObservedRunningTime="2025-10-12 20:26:38.809220072 +0000 UTC m=+147.045518632"
Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.862596 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq"
Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.863030 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.363013607 +0000 UTC m=+147.599312167 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.971454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.971630 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.471603455 +0000 UTC m=+147.707902015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:38 crc kubenswrapper[4773]: I1012 20:26:38.971695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:38 crc kubenswrapper[4773]: E1012 20:26:38.972045 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.472034297 +0000 UTC m=+147.708332857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.076245 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.076807 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.576790197 +0000 UTC m=+147.813088757 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.180301 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.180660 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.680647873 +0000 UTC m=+147.916946433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.281185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.281412 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.781391441 +0000 UTC m=+148.017690001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.386371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.386952 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.886932284 +0000 UTC m=+148.123230914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.474259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" event={"ID":"cfe32921-9171-482d-ba52-b042ec1620c9","Type":"ContainerStarted","Data":"3261793774c5bd0b0fdbb969e4b7d3b45e29e761010db6900db19b9423ad4df2"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.474303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" event={"ID":"cfe32921-9171-482d-ba52-b042ec1620c9","Type":"ContainerStarted","Data":"a87c206e4de84e745853f67e431422a2b5b900b676317323a41bce0cf96b4cb0"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.489286 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.489575 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.989542804 +0000 UTC m=+148.225841364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.489694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.490203 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:39.990187082 +0000 UTC m=+148.226485642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.529527 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l7s6c" podStartSLOduration=126.529510472 podStartE2EDuration="2m6.529510472s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:38.930033632 +0000 UTC m=+147.166332192" watchObservedRunningTime="2025-10-12 20:26:39.529510472 +0000 UTC m=+147.765809032" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.534867 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" event={"ID":"1d6ec546-7ecf-4ec4-b44f-eb2dd5e9a1c6","Type":"ContainerStarted","Data":"4a9e7f5e3777c00a7d0fc365b4b4d5ddbae0750795aecb172915be395da055a8"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.554645 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" event={"ID":"8f82c413-ade9-48f4-9f80-ca87f65b08e5","Type":"ContainerStarted","Data":"a7296beb69654501dce85020118bf72c9be2232a883644fd5d9db96aac99a94c"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.564990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" 
event={"ID":"c36cd04d-971f-4a77-95cc-5e5493c2272f","Type":"ContainerStarted","Data":"555ab5df3c7627811618ee1c01dad44116f20bbc5c0213fd685e0954a3dc9dfd"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.574750 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" event={"ID":"2368666b-711a-4896-8027-ed3a52adc56f","Type":"ContainerStarted","Data":"bd3ba2774b2c3c336018847b84c7db5b891af5dba6fa229509537c67e3e14976"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.574781 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" event={"ID":"2368666b-711a-4896-8027-ed3a52adc56f","Type":"ContainerStarted","Data":"367eddd58d4db8ef5be4875c30c5f3e73e2616d3e22c430a294c18ca0d2bb2ac"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.577427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" event={"ID":"5565c983-8814-411e-b913-0ea8e4d73c0f","Type":"ContainerStarted","Data":"35eb4c29f0d909cc1c2e53fa8b98635eb64f3cf35abdbbd2c513a4d613f7c2c1"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.582035 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" podStartSLOduration=126.582010391 podStartE2EDuration="2m6.582010391s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.581517117 +0000 UTC m=+147.817815677" watchObservedRunningTime="2025-10-12 20:26:39.582010391 +0000 UTC m=+147.818308951" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.582171 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jp7mp" 
podStartSLOduration=126.582167665 podStartE2EDuration="2m6.582167665s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.532872396 +0000 UTC m=+147.769170956" watchObservedRunningTime="2025-10-12 20:26:39.582167665 +0000 UTC m=+147.818466225" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.590773 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.591141 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.091021833 +0000 UTC m=+148.327320393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.591318 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.591654 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.091647311 +0000 UTC m=+148.327945871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.616247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zdjkd" event={"ID":"8fbf52ce-1e8f-4e75-9abd-b4660de1b941","Type":"ContainerStarted","Data":"598810546b916ba6702307873b4a550214599d19cf9bfcef10b75f77209eab45"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.657231 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" event={"ID":"49173688-9a35-41db-a392-97d250a3780a","Type":"ContainerStarted","Data":"bab89c897bf3f4b1e11493dd2e5e8ddc6dc0cc3febb4cf9a1262b6903b0421f3"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.657960 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.673613 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:39 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:39 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:39 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.673685 4773 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.699701 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.701075 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.201057151 +0000 UTC m=+148.437355711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.723601 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwfzb" podStartSLOduration=126.723585212 podStartE2EDuration="2m6.723585212s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.723049967 +0000 UTC m=+147.959348527" watchObservedRunningTime="2025-10-12 
20:26:39.723585212 +0000 UTC m=+147.959883772" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.766836 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86ftc" podStartSLOduration=126.76679543 podStartE2EDuration="2m6.76679543s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.765035711 +0000 UTC m=+148.001334271" watchObservedRunningTime="2025-10-12 20:26:39.76679543 +0000 UTC m=+148.003094000" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.797862 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-74w74" event={"ID":"0f59df9e-60ab-48cf-b890-f5eb8e05557e","Type":"ContainerStarted","Data":"87e53b3b1f8b2ef0c5649fd5bc9645e607d26dd645d294e6bf4cdeb79a6fa27a"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.797935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-74w74" event={"ID":"0f59df9e-60ab-48cf-b890-f5eb8e05557e","Type":"ContainerStarted","Data":"5fcafeed3422ad31cdbe45791a0749c216fdded18d08422c98f0809e1fdb8217"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.820605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.822316 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:40.322292243 +0000 UTC m=+148.558590803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.825398 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" event={"ID":"20b7f5f4-5661-49f4-a045-d2c6cfc24ebf","Type":"ContainerStarted","Data":"512b3cf1cca2f22cb5cd165fa7a3646f0cdabd225cc0589909b2a1e75ed60626"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.859883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" event={"ID":"661f7ebe-f3ce-488c-81ea-a99f55b38c2b","Type":"ContainerStarted","Data":"6fe9128444ec6ba5e1e6b421996a15cad91e97af1d08f93ad35d3b938606e2d0"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.859935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" event={"ID":"661f7ebe-f3ce-488c-81ea-a99f55b38c2b","Type":"ContainerStarted","Data":"63f11456bd2d8a5f893efe1846dd650f1769e71dad776da75ee7d68b4a4cd16a"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.890116 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" podStartSLOduration=126.89009088 podStartE2EDuration="2m6.89009088s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.839226207 +0000 UTC m=+148.075524767" watchObservedRunningTime="2025-10-12 20:26:39.89009088 +0000 UTC m=+148.126389440" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.892111 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" event={"ID":"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0","Type":"ContainerStarted","Data":"3e30c8112ce7fcecaaf2053dcf896ab5057048116de0d56acf297d772e619984"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.892149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" event={"ID":"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0","Type":"ContainerStarted","Data":"0883417ba8cbc0c826505b53073ad9bb0faf5ad266da341a09aaaff007f6bb34"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.922834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:39 crc kubenswrapper[4773]: E1012 20:26:39.923539 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.423521565 +0000 UTC m=+148.659820125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.947996 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" event={"ID":"0ff2ee20-e160-48f5-b794-8a1d38750b59","Type":"ContainerStarted","Data":"92ccf51bce61191f5a85dfcee31621ce659cdb907a0977d317dd658872eae05c"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.949282 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.952051 4773 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-kk4lm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.952094 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" podUID="0ff2ee20-e160-48f5-b794-8a1d38750b59" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.970504 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" 
event={"ID":"fa851b59-ffb3-46c4-a61e-31f85d43eb7a","Type":"ContainerStarted","Data":"b762b99fc6f81d84c503488d0137f254acc1aa161ea88cab5322dcbee673b944"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.977012 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h54fr" podStartSLOduration=126.976997751 podStartE2EDuration="2m6.976997751s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.975043656 +0000 UTC m=+148.211342216" watchObservedRunningTime="2025-10-12 20:26:39.976997751 +0000 UTC m=+148.213296311" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.977132 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zdjkd" podStartSLOduration=7.977128255 podStartE2EDuration="7.977128255s" podCreationTimestamp="2025-10-12 20:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:39.891686244 +0000 UTC m=+148.127984804" watchObservedRunningTime="2025-10-12 20:26:39.977128255 +0000 UTC m=+148.213426815" Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.985273 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbzcl" event={"ID":"0c398d65-6acf-4db2-9a00-e712405e7a1e","Type":"ContainerStarted","Data":"8a8bf4408e41c6375859d4a4e77d7050fdf64d7e29a0eaefeac2935c1f5bb111"} Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.987048 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" event={"ID":"5c7958da-ae4a-4fc3-b703-43b878f090f1","Type":"ContainerStarted","Data":"4615ce9cf6871e0b2667642b58b5ce7d518d6f78c3e331e7362d3f82291fc575"} 
Oct 12 20:26:39 crc kubenswrapper[4773]: I1012 20:26:39.988830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" event={"ID":"e5d1cf94-045f-4927-8955-88a732596ec9","Type":"ContainerStarted","Data":"1687471310bc022ba0595f46be92d43e76c9a56cea80aa7e701cdd0f201e9bd1"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.000093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" event={"ID":"cb6140cb-0bb4-4fe5-bc14-85ac2e640334","Type":"ContainerStarted","Data":"7375183bc84201c2724c425a6ab273ea033e7339c26d84be204445d263da6728"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.000146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" event={"ID":"cb6140cb-0bb4-4fe5-bc14-85ac2e640334","Type":"ContainerStarted","Data":"b7b941bbaeeeb56acb39911424cda5b78b09f6d20c31b0b61007bb3bbf7ad925"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.001094 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.003042 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nx5ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.003084 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" 
Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.018771 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-74w74" podStartSLOduration=127.018755079 podStartE2EDuration="2m7.018755079s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.016251259 +0000 UTC m=+148.252549819" watchObservedRunningTime="2025-10-12 20:26:40.018755079 +0000 UTC m=+148.255053639" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.036368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.036424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" event={"ID":"ea8bd86d-2829-4e53-bdfe-fcd67cb59ca6","Type":"ContainerStarted","Data":"54b6b2bd08f15e21996ff9716f67e24a1e08396906c8a3590d58848950944ad8"} Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.038953 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.538936274 +0000 UTC m=+148.775234834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.068186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" event={"ID":"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad","Type":"ContainerStarted","Data":"6117da6e76a630e1198b537e626f25cbad8c41012ad06af022b635dde3523729"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.068237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" event={"ID":"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad","Type":"ContainerStarted","Data":"3f6db99c822d2422ce5f7e7baf3868cc492947ecf44b68841676734a04374325"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.099154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" event={"ID":"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12","Type":"ContainerStarted","Data":"51d6901d2c3d955c1d42715f976d7a9bd90069da20366f110c329546577af1b4"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.109743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" event={"ID":"d1f75e12-58b5-40ee-8254-70653c1b78bf","Type":"ContainerStarted","Data":"d334efa490df966e61192b685864a81f971df1f496d82babec0d40a48535dd63"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.120675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" event={"ID":"f07f1630-495c-4a1b-9852-5dbf1887f672","Type":"ContainerStarted","Data":"8798a63240168316d75dcfed5de6d8e3d1fa78c6087de51209e67e410f8237e5"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.120738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" event={"ID":"f07f1630-495c-4a1b-9852-5dbf1887f672","Type":"ContainerStarted","Data":"abc3718a6dd48f736cad8fb57f14b6a801bf60a64894b97c054cbad370ac10d9"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.121423 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.122678 4773 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hgxst container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.122732 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" podUID="f07f1630-495c-4a1b-9852-5dbf1887f672" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.130516 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" podStartSLOduration=127.130499835 podStartE2EDuration="2m7.130499835s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 
20:26:40.130095594 +0000 UTC m=+148.366394154" watchObservedRunningTime="2025-10-12 20:26:40.130499835 +0000 UTC m=+148.366798395" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.132594 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" podStartSLOduration=127.132586604 podStartE2EDuration="2m7.132586604s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.067669668 +0000 UTC m=+148.303968228" watchObservedRunningTime="2025-10-12 20:26:40.132586604 +0000 UTC m=+148.368885164" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.136926 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.138345 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" event={"ID":"81a2a130-fa0c-4fe9-b311-3698669ba724","Type":"ContainerStarted","Data":"edb53a471609eaa74511f747b2e2dff912ee828456307aed784a06a3a13d33c2"} Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.138458 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.139362 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:40.639338272 +0000 UTC m=+148.875636832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.175855 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.255348 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.255844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.256201 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:40.756186891 +0000 UTC m=+148.992485451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.267912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.293336 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-55vqv" podStartSLOduration=127.29331965 podStartE2EDuration="2m7.29331965s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.292881338 +0000 UTC m=+148.529179898" watchObservedRunningTime="2025-10-12 20:26:40.29331965 +0000 UTC m=+148.529618210" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.293973 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rmd4d" podStartSLOduration=127.293968438 podStartE2EDuration="2m7.293968438s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.197097608 +0000 UTC m=+148.433396168" watchObservedRunningTime="2025-10-12 20:26:40.293968438 +0000 UTC m=+148.530266998" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.358166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.358397 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.85838183 +0000 UTC m=+149.094680390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.358636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.358770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.358798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.358889 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.360435 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.860415587 +0000 UTC m=+149.096714147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.379558 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" podStartSLOduration=127.379531182 podStartE2EDuration="2m7.379531182s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.376470146 +0000 UTC m=+148.612768706" watchObservedRunningTime="2025-10-12 20:26:40.379531182 +0000 UTC m=+148.615829742" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.384946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.385443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.386913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.462185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.462848 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:40.962832672 +0000 UTC m=+149.199131232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.509860 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.530428 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.558698 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-j2cps" podStartSLOduration=127.558683824 podStartE2EDuration="2m7.558683824s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.443628735 +0000 UTC m=+148.679927295" watchObservedRunningTime="2025-10-12 20:26:40.558683824 +0000 UTC m=+148.794982374" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.559345 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" podStartSLOduration=127.559339792 podStartE2EDuration="2m7.559339792s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:40.559019883 +0000 UTC m=+148.795318443" 
watchObservedRunningTime="2025-10-12 20:26:40.559339792 +0000 UTC m=+148.795638352" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.564824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.565103 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.065092923 +0000 UTC m=+149.301391483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.603570 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.646155 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:40 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:40 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:40 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.646239 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.672475 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.672874 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.172857608 +0000 UTC m=+149.409156168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.774929 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.775445 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.275433198 +0000 UTC m=+149.511731758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.876544 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.876740 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.3766783 +0000 UTC m=+149.612976860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.876812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.877227 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.377218505 +0000 UTC m=+149.613517065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.978459 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.978628 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.478592951 +0000 UTC m=+149.714891511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:40 crc kubenswrapper[4773]: I1012 20:26:40.978982 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:40 crc kubenswrapper[4773]: E1012 20:26:40.979323 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.479311721 +0000 UTC m=+149.715610271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.089298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.090196 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.590159002 +0000 UTC m=+149.826457552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.182490 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" event={"ID":"5565c983-8814-411e-b913-0ea8e4d73c0f","Type":"ContainerStarted","Data":"f2734d002fe5e49905f4e0c20eb2c2fd5cbd5a533f6142200d8a6342f90f8c72"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.191653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.192191 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.692172716 +0000 UTC m=+149.928471276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.199631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" event={"ID":"fa851b59-ffb3-46c4-a61e-31f85d43eb7a","Type":"ContainerStarted","Data":"ab704cbef32e7793e1a793ebf040873e74de2963dff4503e18a01b96bf50ecf0"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.213460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" event={"ID":"49173688-9a35-41db-a392-97d250a3780a","Type":"ContainerStarted","Data":"28273712fcc2aa075a170d4083761f4896522a491448b74ea549039ea788a0d5"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.232360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" event={"ID":"1c64acaf-f1e6-4ff8-8545-f305a5c1e3d0","Type":"ContainerStarted","Data":"b3778d75cc00b0109332cb70fe3e1123d5ac8452761a712be6cf0ab76e3cb58e"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.260711 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" event={"ID":"d1f75e12-58b5-40ee-8254-70653c1b78bf","Type":"ContainerStarted","Data":"7a9099fdf551f316cf14b9243952e5f24a0c8be501d0bb349d02471225bcf387"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.280736 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" podStartSLOduration=128.280689602 podStartE2EDuration="2m8.280689602s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:41.272513824 +0000 UTC m=+149.508812384" watchObservedRunningTime="2025-10-12 20:26:41.280689602 +0000 UTC m=+149.516988162" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.293137 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.293488 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.79347156 +0000 UTC m=+150.029770120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.293588 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dsgf4" event={"ID":"8f3c827e-a6d3-4b92-8ab3-4a3513e7fe12","Type":"ContainerStarted","Data":"66a5063f54a3e0d3f8358446cef9f244ff4f0b26c1fbf438d7e853872ea16c3e"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.320454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" event={"ID":"c36cd04d-971f-4a77-95cc-5e5493c2272f","Type":"ContainerStarted","Data":"a0243779075ce82904a405b5b3f8aaf9ca6c31caff5f96ec1221320a5a6d6d98"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.320500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" event={"ID":"c36cd04d-971f-4a77-95cc-5e5493c2272f","Type":"ContainerStarted","Data":"fda0fa19eae1476e165549088858e577901f5a4474439dabf0e4c0083e18e323"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.326747 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" event={"ID":"f4a668a7-8687-4d7c-ab9c-fc56447681d7","Type":"ContainerStarted","Data":"b9279750c7ca6156c22580fb2fd6c15761df429111937dd4c2d38c10b98e2072"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.336279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbzcl" 
event={"ID":"0c398d65-6acf-4db2-9a00-e712405e7a1e","Type":"ContainerStarted","Data":"a8daec3c6e9e09e75665286086c2b460e2cbb2c36cc3fd7f71a9cf31ff31846c"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.336904 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.363551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" event={"ID":"5c7958da-ae4a-4fc3-b703-43b878f090f1","Type":"ContainerStarted","Data":"6228e23c7510b1490ee4cf4ef8d056ab453e180555691d97e901e7ba264f71e0"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.363590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" event={"ID":"5c7958da-ae4a-4fc3-b703-43b878f090f1","Type":"ContainerStarted","Data":"4b9a41c0b18618f4d47a46cabc2d0e4bca1f68da6124f91ecda3a85a9b940c65"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.371802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" event={"ID":"661f7ebe-f3ce-488c-81ea-a99f55b38c2b","Type":"ContainerStarted","Data":"5120b161357db11d776c3b400c04dac6659ba3e38674cc822f65780b57bc4baa"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.394281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.396695 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.896682447 +0000 UTC m=+150.132981007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.408497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" event={"ID":"430f4c9b-ec64-4f26-a3b1-d50ed21cdaad","Type":"ContainerStarted","Data":"7c2eb4e13a4f8a88937ec98f4d8bed58a4ff7f0b9b2651dd9edb98b576001932"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.411140 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cl9q6" podStartSLOduration=128.411119251 podStartE2EDuration="2m8.411119251s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:41.409934508 +0000 UTC m=+149.646233068" watchObservedRunningTime="2025-10-12 20:26:41.411119251 +0000 UTC m=+149.647417811" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.423325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" 
event={"ID":"0ff2ee20-e160-48f5-b794-8a1d38750b59","Type":"ContainerStarted","Data":"d54aaf42cd2ae781623664e6ec877c14fa0a7c0a667269d752b9334f98570a73"} Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.425826 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nx5ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.425864 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.494249 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kk4lm" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.495511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.495732 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.995689997 +0000 UTC m=+150.231988557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.495912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.496652 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:41.996633224 +0000 UTC m=+150.232931784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.597271 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.598412 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.098362678 +0000 UTC m=+150.334661238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.648379 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:41 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:41 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:41 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.648425 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.701383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.701692 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:42.201681409 +0000 UTC m=+150.437979969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.802815 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.803114 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.303096566 +0000 UTC m=+150.539395126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.904592 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:41 crc kubenswrapper[4773]: E1012 20:26:41.904996 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.404978916 +0000 UTC m=+150.641277476 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:41 crc kubenswrapper[4773]: I1012 20:26:41.972970 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-89dsq" podStartSLOduration=128.972951138 podStartE2EDuration="2m8.972951138s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:41.721625657 +0000 UTC m=+149.957924217" watchObservedRunningTime="2025-10-12 20:26:41.972951138 +0000 UTC m=+150.209249698" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.006422 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.006709 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.506693002 +0000 UTC m=+150.742991562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.060428 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" podStartSLOduration=130.060408264 podStartE2EDuration="2m10.060408264s" podCreationTimestamp="2025-10-12 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:41.984799209 +0000 UTC m=+150.221097769" watchObservedRunningTime="2025-10-12 20:26:42.060408264 +0000 UTC m=+150.296706824" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.107341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.107585 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.607575394 +0000 UTC m=+150.843873954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.208830 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.209340 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.70932008 +0000 UTC m=+150.945618630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.238812 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gp2tr" podStartSLOduration=129.238794535 podStartE2EDuration="2m9.238794535s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:42.151994547 +0000 UTC m=+150.388293107" watchObservedRunningTime="2025-10-12 20:26:42.238794535 +0000 UTC m=+150.475093105" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.312415 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.313263 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.813250328 +0000 UTC m=+151.049548888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.428202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.428658 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:42.928640866 +0000 UTC m=+151.164939426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.428807 4773 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hgxst container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.428838 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" podUID="f07f1630-495c-4a1b-9852-5dbf1887f672" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.468276 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jbzcl" podStartSLOduration=10.468255764 podStartE2EDuration="10.468255764s" podCreationTimestamp="2025-10-12 20:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:42.266361756 +0000 UTC m=+150.502660316" watchObservedRunningTime="2025-10-12 20:26:42.468255764 +0000 UTC m=+150.704554324" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.500321 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-lgcxv" podStartSLOduration=129.500301921 podStartE2EDuration="2m9.500301921s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:42.470813786 +0000 UTC m=+150.707112346" watchObservedRunningTime="2025-10-12 20:26:42.500301921 +0000 UTC m=+150.736600471" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.510802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" event={"ID":"f4a668a7-8687-4d7c-ab9c-fc56447681d7","Type":"ContainerStarted","Data":"182e7cbec56b78afe38b769322d4e8994d4bc48e4d7c4eb8eb3c45e8ac80661d"} Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.521444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"82da17f205bca4607f15b46b1fed962f7569390e90dff4d8c72d4a4c10446144"} Oct 12 20:26:42 crc kubenswrapper[4773]: W1012 20:26:42.564230 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-98829199ab119e9422b9d5b6a4800d69b101e965156c96e3f45b4520ba44c9bc WatchSource:0}: Error finding container 98829199ab119e9422b9d5b6a4800d69b101e965156c96e3f45b4520ba44c9bc: Status 404 returned error can't find the container with id 98829199ab119e9422b9d5b6a4800d69b101e965156c96e3f45b4520ba44c9bc Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.568123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.568488 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nx5ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.568538 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.568552 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.06853878 +0000 UTC m=+151.304837340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.623242 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vpzth" podStartSLOduration=129.62322506 podStartE2EDuration="2m9.62322506s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:42.622676764 +0000 UTC m=+150.858975324" watchObservedRunningTime="2025-10-12 20:26:42.62322506 +0000 UTC m=+150.859523610" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.623736 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pbtbs" podStartSLOduration=129.623729704 podStartE2EDuration="2m9.623729704s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:42.56567097 +0000 UTC m=+150.801969530" watchObservedRunningTime="2025-10-12 20:26:42.623729704 +0000 UTC m=+150.860028264" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.644307 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:42 
crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:42 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:42 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.644373 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.671137 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.672616 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.172597481 +0000 UTC m=+151.408896041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.777422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.777851 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.277837605 +0000 UTC m=+151.514136165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.832826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hgxst" Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.879162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.879862 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.379835368 +0000 UTC m=+151.616133928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:42 crc kubenswrapper[4773]: I1012 20:26:42.980685 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:42 crc kubenswrapper[4773]: E1012 20:26:42.981038 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.481025189 +0000 UTC m=+151.717323739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.081300 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.081538 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.581525271 +0000 UTC m=+151.817823831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.182438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.182790 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.682775633 +0000 UTC m=+151.919074193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.283897 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.284326 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.784309524 +0000 UTC m=+152.020608074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.385346 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.385684 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.885671109 +0000 UTC m=+152.121969669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.486570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.486818 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.986790028 +0000 UTC m=+152.223088588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.486926 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.487274 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:43.987266572 +0000 UTC m=+152.223565132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.527134 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6286d066cc70f23c99516994f370e069dbc93fe8e8e18f0de7b2e58aa240ee36"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.527182 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"98829199ab119e9422b9d5b6a4800d69b101e965156c96e3f45b4520ba44c9bc"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.529907 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" event={"ID":"f4a668a7-8687-4d7c-ab9c-fc56447681d7","Type":"ContainerStarted","Data":"f20326faa22e38dbb67b6a94c1f87d82c66925783d807ddb98af648f19853add"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.531524 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f243d906a9d06bc09642a51a9097285b45f369f0693b643fff46b3fca64f8352"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.533218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"822ef2ad979b1f07a46861d2b25dd6f6934727bd2bada0e5c02170b6ca180b4e"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.533308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"887635fee65c4e225dfabc8aeb316686ea8879e7d3ac384e892309aad9a92c99"} Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.587806 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.587962 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.087933968 +0000 UTC m=+152.324232528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.588187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.589238 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.089222034 +0000 UTC m=+152.325520594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.641625 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:43 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:43 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:43 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.641694 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.689254 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.689455 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:44.189426927 +0000 UTC m=+152.425725487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.814223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.818142 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.318128578 +0000 UTC m=+152.554427138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.855246 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.856165 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.878060 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.916397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.916647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwhf\" (UniqueName: \"kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.916686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.916738 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:43 crc kubenswrapper[4773]: E1012 20:26:43.916896 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.41687981 +0000 UTC m=+152.653178370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.939609 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.970782 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:26:43 crc kubenswrapper[4773]: I1012 20:26:43.971701 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.017591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwhf\" (UniqueName: \"kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.017643 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.017685 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.017705 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.017986 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:44.517972728 +0000 UTC m=+152.754271288 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.018424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.018475 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.024115 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.048864 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.048901 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.050404 4773 patch_prober.go:28] interesting 
pod/console-f9d7485db-gfc75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.050452 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gfc75" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.069272 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-m6mmg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.069587 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-m6mmg" podUID="55db89f5-582b-4326-8150-4e51c83f0706" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.070226 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-m6mmg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.070298 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-m6mmg" podUID="55db89f5-582b-4326-8150-4e51c83f0706" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 12 20:26:44 crc 
kubenswrapper[4773]: I1012 20:26:44.093376 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.093640 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.093680 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.094082 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.104335 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.118663 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.118935 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2bd\" (UniqueName: \"kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.118999 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.119059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.119836 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.619818118 +0000 UTC m=+152.856116678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.138038 4773 patch_prober.go:28] interesting pod/apiserver-76f77b778f-n6tdw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]log ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]etcd ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/generic-apiserver-start-informers ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/max-in-flight-filter ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 12 20:26:44 crc kubenswrapper[4773]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 12 20:26:44 crc kubenswrapper[4773]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/project.openshift.io-projectcache ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-startinformers ok Oct 12 20:26:44 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 12 20:26:44 crc 
kubenswrapper[4773]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 12 20:26:44 crc kubenswrapper[4773]: livez check failed Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.138122 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" podUID="d1f75e12-58b5-40ee-8254-70653c1b78bf" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.152459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwhf\" (UniqueName: \"kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf\") pod \"certified-operators-x5t25\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.174252 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.184399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.219217 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.221240 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.222059 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.222108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.222147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.222257 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2bd\" (UniqueName: \"kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.223157 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities\") pod \"community-operators-rx9db\" 
(UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.224018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.224507 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.724495016 +0000 UTC m=+152.960793566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.264651 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.304430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2bd\" (UniqueName: \"kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd\") pod \"community-operators-rx9db\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.324486 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.324937 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.325003 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.325026 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzf6\" (UniqueName: \"kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.325252 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.825227064 +0000 UTC m=+153.061525624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.364543 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.373969 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.381312 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426672 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bxt\" (UniqueName: \"kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426695 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426775 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.426846 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzf6\" (UniqueName: \"kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.427519 4773 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:44.927505645 +0000 UTC m=+153.163804205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.428461 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.429358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.483856 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzf6\" (UniqueName: \"kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6\") pod \"certified-operators-9vqzr\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.530023 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.530280 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bxt\" (UniqueName: \"kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.530309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.530328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.530730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.530814 4773 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.030798565 +0000 UTC m=+153.267097125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.531228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.550566 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" event={"ID":"f4a668a7-8687-4d7c-ab9c-fc56447681d7","Type":"ContainerStarted","Data":"0059f3c576df78c9dfaeebabead204697ba75b6e6532af437b84112045221b6b"} Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.556817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bxt\" (UniqueName: \"kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt\") pod \"community-operators-tw42c\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.562446 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.570993 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rh842" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.582833 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.620744 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hxq5p" podStartSLOduration=12.62069717 podStartE2EDuration="12.62069717s" podCreationTimestamp="2025-10-12 20:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:44.593419617 +0000 UTC m=+152.829718177" watchObservedRunningTime="2025-10-12 20:26:44.62069717 +0000 UTC m=+152.856995730" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.631894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.651521 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.151503042 +0000 UTC m=+153.387801602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.657442 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:44 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:44 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:44 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.657516 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.713086 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.737310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.737629 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.237613921 +0000 UTC m=+153.473912481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.838916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.839468 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.339435829 +0000 UTC m=+153.575734389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.847261 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.940618 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.940786 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.440760864 +0000 UTC m=+153.677059424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:44 crc kubenswrapper[4773]: I1012 20:26:44.941409 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:44 crc kubenswrapper[4773]: E1012 20:26:44.941819 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.441810693 +0000 UTC m=+153.678109253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.044908 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.045210 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.545193235 +0000 UTC m=+153.781491795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.146556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.146915 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.64689853 +0000 UTC m=+153.883197090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.176522 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.177159 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.184360 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.184700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.213000 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.248520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.248861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.248983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.249196 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.749169611 +0000 UTC m=+153.985468171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.353780 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.354183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.354220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.354277 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.354829 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.854815006 +0000 UTC m=+154.091113566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.361938 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.397629 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.458257 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.458603 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:45.958587849 +0000 UTC m=+154.194886409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.469500 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.478569 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.502239 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.560453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.560813 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:46.060800829 +0000 UTC m=+154.297099389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.584090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerStarted","Data":"023e0e5ab67a0a244d99e27e178eb829e6031e0f350f7db27752e41a2130072c"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.584146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerStarted","Data":"8277cab2a02a61996df7a2e7d99459acd51616cd7c34e040cce72fb78dc8b8fa"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.586242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerStarted","Data":"b60a0a4383009859030f1599d018fcd4e41d6d5acae2a16d089e5dabd17f77a1"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.588616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerStarted","Data":"4f358fa97540dfb2b7922bdd2e05730abe90bf390efe9b56e8f979d66f232098"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.594262 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="b5b41529-16fa-43a7-a245-34ca5e013832" containerID="7e9d5cee17eb7b46cdbceceb01ea7083d95147acabbef24b7ba5231aa9f9e08d" exitCode=0 Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.595236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerDied","Data":"7e9d5cee17eb7b46cdbceceb01ea7083d95147acabbef24b7ba5231aa9f9e08d"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.595255 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerStarted","Data":"7639d4ffe4ec13e5174aabe0dce14cd661d43c0d89fe9007cd8aeabdbe237440"} Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.596530 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.639705 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.651328 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:45 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:45 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:45 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.651411 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:45 crc 
kubenswrapper[4773]: I1012 20:26:45.662309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.663172 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.163146412 +0000 UTC m=+154.399444972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.763676 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.764780 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:46.264767345 +0000 UTC m=+154.501065905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.783601 4773 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.819341 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.819993 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.827287 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.827696 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.833789 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.864387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.864590 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.364563016 +0000 UTC m=+154.600861576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.864664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.864698 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.864784 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.865083 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 20:26:46.365070471 +0000 UTC m=+154.601369031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.938937 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.941072 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.942398 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:45 crc kubenswrapper[4773]: W1012 20:26:45.943398 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd82c578a_7570_486c_a9cf_e22e7ed52e9f.slice/crio-8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160 WatchSource:0}: Error finding container 8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160: Status 404 returned error can't find the container with id 8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160 Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.945282 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.958579 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.965372 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.965641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.965747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.965875 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:45 crc kubenswrapper[4773]: E1012 20:26:45.965887 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.46586035 +0000 UTC m=+154.702158910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:45 crc kubenswrapper[4773]: I1012 20:26:45.994566 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.067808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities\") 
pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.067880 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95xv\" (UniqueName: \"kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.067949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.067982 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.068449 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.56843177 +0000 UTC m=+154.804730330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.087320 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.141263 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.169327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.169733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.169764 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95xv\" (UniqueName: \"kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " 
pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.169952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.170296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.170363 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.670348811 +0000 UTC m=+154.906647371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.170636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.185345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95xv\" (UniqueName: \"kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv\") pod \"redhat-marketplace-pqntd\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.257498 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.271613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.272060 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.772041156 +0000 UTC m=+155.008339706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.349924 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.350919 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.363014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.376163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.376463 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.876448327 +0000 UTC m=+155.112746877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.450977 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.477233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.477289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8bh\" (UniqueName: \"kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.477322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.477352 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.477662 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:46.977649588 +0000 UTC m=+155.213948148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.578556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.578759 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:47.078732645 +0000 UTC m=+155.315031205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.579084 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8bh\" (UniqueName: \"kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.579114 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.579145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.579210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content\") pod \"redhat-marketplace-pqrvr\" (UID: 
\"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.579558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.580120 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:47.080109464 +0000 UTC m=+155.316408024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.580335 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.606146 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.620896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6x8bh\" (UniqueName: \"kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh\") pod \"redhat-marketplace-pqrvr\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.623649 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d82c578a-7570-486c-a9cf-e22e7ed52e9f","Type":"ContainerStarted","Data":"cf8ed72835763161c699b1c895dee45a5784c23b707d82dbbdc6bea8321f5f80"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.623682 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d82c578a-7570-486c-a9cf-e22e7ed52e9f","Type":"ContainerStarted","Data":"8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.639989 4773 generic.go:334] "Generic (PLEG): container finished" podID="468fec97-d137-481e-a61b-9e385a5165a5" containerID="023e0e5ab67a0a244d99e27e178eb829e6031e0f350f7db27752e41a2130072c" exitCode=0 Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.640088 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerDied","Data":"023e0e5ab67a0a244d99e27e178eb829e6031e0f350f7db27752e41a2130072c"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.644555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.644540406 podStartE2EDuration="1.644540406s" podCreationTimestamp="2025-10-12 20:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:46.64108083 +0000 UTC 
m=+154.877379380" watchObservedRunningTime="2025-10-12 20:26:46.644540406 +0000 UTC m=+154.880838966" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.652016 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:46 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:46 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:46 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.652087 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.655000 4773 generic.go:334] "Generic (PLEG): container finished" podID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerID="45ff9250c99b9ee894f62afcf9fd5877934581632cc4f6c89a5808227c4ff8c1" exitCode=0 Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.655113 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerDied","Data":"45ff9250c99b9ee894f62afcf9fd5877934581632cc4f6c89a5808227c4ff8c1"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.658405 4773 generic.go:334] "Generic (PLEG): container finished" podID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerID="b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d" exitCode=0 Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.658453 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" 
event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerDied","Data":"b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.661239 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0d07389-38be-4bdf-960b-98e9e2ce8eb4","Type":"ContainerStarted","Data":"388e42352f49f643f0df0f22d0d7e9b1271773cae30339c7a2bac9e11138c3de"} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.675946 4773 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-12T20:26:45.783631322Z","Handler":null,"Name":""} Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.680560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.681639 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 20:26:47.181609473 +0000 UTC m=+155.417908033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.682413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: E1012 20:26:46.682765 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 20:26:47.182756606 +0000 UTC m=+155.419055166 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4j5xq" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.704126 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.738369 4773 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.738398 4773 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.783533 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.788561 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.885835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.900724 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.901146 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.953941 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.955099 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.956905 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 20:26:46 crc kubenswrapper[4773]: I1012 20:26:46.970419 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.001111 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4j5xq\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.037916 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.089294 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.089332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.089362 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhhr\" (UniqueName: \"kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.193125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhhr\" (UniqueName: \"kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.193639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.193658 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.194245 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.194466 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.212338 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhhr\" (UniqueName: \"kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr\") pod \"redhat-operators-vgdhf\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.291988 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.307215 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.325964 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.349097 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.352563 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.361941 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.396877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqls\" (UniqueName: \"kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.397018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.397067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: W1012 20:26:47.403882 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ef61d4_c799_4aeb_9f05_4b5202a8abea.slice/crio-028f98770c9c015922d8a5d3660d8162597399050116ca0a230baee3876d616c WatchSource:0}: Error finding container 028f98770c9c015922d8a5d3660d8162597399050116ca0a230baee3876d616c: Status 404 returned error can't find the container with id 
028f98770c9c015922d8a5d3660d8162597399050116ca0a230baee3876d616c Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.501839 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.501920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.501965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqls\" (UniqueName: \"kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.502498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.502559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 
20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.532762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqls\" (UniqueName: \"kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls\") pod \"redhat-operators-5llck\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.652230 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:47 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:47 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:47 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.652298 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.710850 4773 generic.go:334] "Generic (PLEG): container finished" podID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerID="d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb" exitCode=0 Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.711433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerDied","Data":"d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.711502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" 
event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerStarted","Data":"028f98770c9c015922d8a5d3660d8162597399050116ca0a230baee3876d616c"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.719433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" event={"ID":"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c","Type":"ContainerStarted","Data":"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.719499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" event={"ID":"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c","Type":"ContainerStarted","Data":"b3e8204b13495af2d35a116d4f40396baca37b3af6e0f8609e4745f480f8e88b"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.720458 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.732160 4773 generic.go:334] "Generic (PLEG): container finished" podID="5565c983-8814-411e-b913-0ea8e4d73c0f" containerID="f2734d002fe5e49905f4e0c20eb2c2fd5cbd5a533f6142200d8a6342f90f8c72" exitCode=0 Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.732301 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" event={"ID":"5565c983-8814-411e-b913-0ea8e4d73c0f","Type":"ContainerDied","Data":"f2734d002fe5e49905f4e0c20eb2c2fd5cbd5a533f6142200d8a6342f90f8c72"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.737094 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.743608 4773 generic.go:334] "Generic (PLEG): container finished" podID="d0d07389-38be-4bdf-960b-98e9e2ce8eb4" 
containerID="c39e49d597f2a0dbbcf746d17a9b4a4a4b597ad476d64db4dc0d0f1f81c58975" exitCode=0 Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.743700 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0d07389-38be-4bdf-960b-98e9e2ce8eb4","Type":"ContainerDied","Data":"c39e49d597f2a0dbbcf746d17a9b4a4a4b597ad476d64db4dc0d0f1f81c58975"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.747816 4773 generic.go:334] "Generic (PLEG): container finished" podID="d82c578a-7570-486c-a9cf-e22e7ed52e9f" containerID="cf8ed72835763161c699b1c895dee45a5784c23b707d82dbbdc6bea8321f5f80" exitCode=0 Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.747883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d82c578a-7570-486c-a9cf-e22e7ed52e9f","Type":"ContainerDied","Data":"cf8ed72835763161c699b1c895dee45a5784c23b707d82dbbdc6bea8321f5f80"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.751996 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerID="665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b" exitCode=0 Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.752032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerDied","Data":"665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.752052 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerStarted","Data":"b29d5099615fa0dcd8bb44a3f72d72891097ea7c6552f507171cef55db667c37"} Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.759902 4773 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:26:47 crc kubenswrapper[4773]: I1012 20:26:47.778803 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" podStartSLOduration=134.778778047 podStartE2EDuration="2m14.778778047s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:26:47.772101201 +0000 UTC m=+156.008399761" watchObservedRunningTime="2025-10-12 20:26:47.778778047 +0000 UTC m=+156.015076607" Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.169929 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:26:48 crc kubenswrapper[4773]: W1012 20:26:48.181879 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda05a3bf_3763_4f69_9392_9ee4204a97c1.slice/crio-25e06574b7e1f33a27d12e5d92ac6ac43fbef422c58c40c04c8cc5ecf6dda8f8 WatchSource:0}: Error finding container 25e06574b7e1f33a27d12e5d92ac6ac43fbef422c58c40c04c8cc5ecf6dda8f8: Status 404 returned error can't find the container with id 25e06574b7e1f33a27d12e5d92ac6ac43fbef422c58c40c04c8cc5ecf6dda8f8 Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.498158 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.641574 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:48 crc kubenswrapper[4773]: 
[-]has-synced failed: reason withheld Oct 12 20:26:48 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:48 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.641629 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.787581 4773 generic.go:334] "Generic (PLEG): container finished" podID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerID="21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a" exitCode=0 Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.787753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerDied","Data":"21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a"} Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.787935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerStarted","Data":"25e06574b7e1f33a27d12e5d92ac6ac43fbef422c58c40c04c8cc5ecf6dda8f8"} Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.796621 4773 generic.go:334] "Generic (PLEG): container finished" podID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerID="48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c" exitCode=0 Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 20:26:48.796984 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerDied","Data":"48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c"} Oct 12 20:26:48 crc kubenswrapper[4773]: I1012 
20:26:48.797047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerStarted","Data":"3db308981d9df8d391dc87ff956c09b894ab615eb1894721268d3098d1d38ee9"} Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.099986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.106038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-n6tdw" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.263122 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.306298 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.346841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir\") pod \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.346879 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir\") pod \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.346936 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access\") pod \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\" (UID: \"d82c578a-7570-486c-a9cf-e22e7ed52e9f\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.346958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access\") pod \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\" (UID: \"d0d07389-38be-4bdf-960b-98e9e2ce8eb4\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.347202 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d82c578a-7570-486c-a9cf-e22e7ed52e9f" (UID: "d82c578a-7570-486c-a9cf-e22e7ed52e9f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.347250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0d07389-38be-4bdf-960b-98e9e2ce8eb4" (UID: "d0d07389-38be-4bdf-960b-98e9e2ce8eb4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.365874 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d82c578a-7570-486c-a9cf-e22e7ed52e9f" (UID: "d82c578a-7570-486c-a9cf-e22e7ed52e9f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.369961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0d07389-38be-4bdf-960b-98e9e2ce8eb4" (UID: "d0d07389-38be-4bdf-960b-98e9e2ce8eb4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.440570 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.452683 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.452734 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.452747 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82c578a-7570-486c-a9cf-e22e7ed52e9f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.452761 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0d07389-38be-4bdf-960b-98e9e2ce8eb4-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.561197 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume\") pod \"5565c983-8814-411e-b913-0ea8e4d73c0f\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.561361 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvnt2\" (UniqueName: \"kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2\") pod \"5565c983-8814-411e-b913-0ea8e4d73c0f\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.561408 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume\") pod \"5565c983-8814-411e-b913-0ea8e4d73c0f\" (UID: \"5565c983-8814-411e-b913-0ea8e4d73c0f\") " Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.563408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume" (OuterVolumeSpecName: "config-volume") pod "5565c983-8814-411e-b913-0ea8e4d73c0f" (UID: "5565c983-8814-411e-b913-0ea8e4d73c0f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.578961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2" (OuterVolumeSpecName: "kube-api-access-hvnt2") pod "5565c983-8814-411e-b913-0ea8e4d73c0f" (UID: "5565c983-8814-411e-b913-0ea8e4d73c0f"). InnerVolumeSpecName "kube-api-access-hvnt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.580566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5565c983-8814-411e-b913-0ea8e4d73c0f" (UID: "5565c983-8814-411e-b913-0ea8e4d73c0f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.652703 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:49 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:49 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:49 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.652791 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.666135 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5565c983-8814-411e-b913-0ea8e4d73c0f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.666186 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvnt2\" (UniqueName: \"kubernetes.io/projected/5565c983-8814-411e-b913-0ea8e4d73c0f-kube-api-access-hvnt2\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.666201 4773 reconciler_common.go:293] "Volume detached for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/5565c983-8814-411e-b913-0ea8e4d73c0f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.825454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" event={"ID":"5565c983-8814-411e-b913-0ea8e4d73c0f","Type":"ContainerDied","Data":"35eb4c29f0d909cc1c2e53fa8b98635eb64f3cf35abdbbd2c513a4d613f7c2c1"} Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.825495 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35eb4c29f0d909cc1c2e53fa8b98635eb64f3cf35abdbbd2c513a4d613f7c2c1" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.825550 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.828147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0d07389-38be-4bdf-960b-98e9e2ce8eb4","Type":"ContainerDied","Data":"388e42352f49f643f0df0f22d0d7e9b1271773cae30339c7a2bac9e11138c3de"} Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.828171 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388e42352f49f643f0df0f22d0d7e9b1271773cae30339c7a2bac9e11138c3de" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.828209 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.869273 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.869317 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d82c578a-7570-486c-a9cf-e22e7ed52e9f","Type":"ContainerDied","Data":"8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160"} Oct 12 20:26:49 crc kubenswrapper[4773]: I1012 20:26:49.869337 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3d461bb425722e11520ed9ca250a00fbd6f206bee998930b41c1a9a467d160" Oct 12 20:26:50 crc kubenswrapper[4773]: I1012 20:26:50.605997 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:26:50 crc kubenswrapper[4773]: I1012 20:26:50.643208 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:50 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:50 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:50 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:50 crc kubenswrapper[4773]: I1012 20:26:50.643273 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:50 crc kubenswrapper[4773]: I1012 20:26:50.744598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jbzcl" Oct 12 20:26:51 crc kubenswrapper[4773]: I1012 20:26:51.640605 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:51 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:51 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:51 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:51 crc kubenswrapper[4773]: I1012 20:26:51.640681 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:52 crc kubenswrapper[4773]: I1012 20:26:52.641746 4773 patch_prober.go:28] interesting pod/router-default-5444994796-hrdqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 20:26:52 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Oct 12 20:26:52 crc kubenswrapper[4773]: [+]process-running ok Oct 12 20:26:52 crc kubenswrapper[4773]: healthz check failed Oct 12 20:26:52 crc kubenswrapper[4773]: I1012 20:26:52.641867 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hrdqp" podUID="11889744-920e-4aec-b094-438235439ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 20:26:53 crc kubenswrapper[4773]: I1012 20:26:53.641513 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:53 crc kubenswrapper[4773]: I1012 20:26:53.643985 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hrdqp" Oct 12 20:26:54 crc kubenswrapper[4773]: I1012 20:26:54.048666 4773 patch_prober.go:28] interesting 
pod/console-f9d7485db-gfc75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 12 20:26:54 crc kubenswrapper[4773]: I1012 20:26:54.048748 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gfc75" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 12 20:26:54 crc kubenswrapper[4773]: I1012 20:26:54.075182 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-m6mmg" Oct 12 20:26:55 crc kubenswrapper[4773]: I1012 20:26:55.795226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:55 crc kubenswrapper[4773]: I1012 20:26:55.802174 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0e0fa58-fcd9-4002-a975-a98fcba0f364-metrics-certs\") pod \"network-metrics-daemon-6sbfz\" (UID: \"a0e0fa58-fcd9-4002-a975-a98fcba0f364\") " pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:55 crc kubenswrapper[4773]: I1012 20:26:55.839750 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6sbfz" Oct 12 20:26:58 crc kubenswrapper[4773]: I1012 20:26:58.669900 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:26:58 crc kubenswrapper[4773]: I1012 20:26:58.670264 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:27:04 crc kubenswrapper[4773]: I1012 20:27:04.056734 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:27:04 crc kubenswrapper[4773]: I1012 20:27:04.065522 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:27:07 crc kubenswrapper[4773]: I1012 20:27:07.042941 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.087130 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.087671 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-49bxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tw42c_openshift-marketplace(468fec97-d137-481e-a61b-9e385a5165a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.089139 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tw42c" 
podUID="468fec97-d137-481e-a61b-9e385a5165a5" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.135381 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.135557 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfwhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod certified-operators-x5t25_openshift-marketplace(b5b41529-16fa-43a7-a245-34ca5e013832): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 20:27:14 crc kubenswrapper[4773]: E1012 20:27:14.136791 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x5t25" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" Oct 12 20:27:15 crc kubenswrapper[4773]: I1012 20:27:15.770622 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-k7zb4" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.524310 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x5t25" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.524368 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tw42c" podUID="468fec97-d137-481e-a61b-9e385a5165a5" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.638042 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 12 20:27:17 crc 
kubenswrapper[4773]: E1012 20:27:17.638621 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfhhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vgdhf_openshift-marketplace(0da0fc67-a78b-402d-ba7a-24209fea4258): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.639712 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vgdhf" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.644392 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.644637 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wm2bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptio
ns:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rx9db_openshift-marketplace(6b6ebc1e-7018-4d16-a56b-a962f165de10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.645870 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rx9db" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.655174 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.655331 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxqls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5llck_openshift-marketplace(da05a3bf-3763-4f69-9392-9ee4204a97c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 20:27:17 crc kubenswrapper[4773]: E1012 20:27:17.656444 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5llck" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" Oct 12 20:27:18 crc 
kubenswrapper[4773]: I1012 20:27:18.050220 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6sbfz"] Oct 12 20:27:18 crc kubenswrapper[4773]: W1012 20:27:18.082646 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e0fa58_fcd9_4002_a975_a98fcba0f364.slice/crio-1372d98a745615630fe9f8016aa80c32dc3c70feecd3d7bcf8d659d13b4ffc7d WatchSource:0}: Error finding container 1372d98a745615630fe9f8016aa80c32dc3c70feecd3d7bcf8d659d13b4ffc7d: Status 404 returned error can't find the container with id 1372d98a745615630fe9f8016aa80c32dc3c70feecd3d7bcf8d659d13b4ffc7d Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.176568 4773 generic.go:334] "Generic (PLEG): container finished" podID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerID="92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b" exitCode=0 Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.176676 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerDied","Data":"92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b"} Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.178917 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" event={"ID":"a0e0fa58-fcd9-4002-a975-a98fcba0f364","Type":"ContainerStarted","Data":"1372d98a745615630fe9f8016aa80c32dc3c70feecd3d7bcf8d659d13b4ffc7d"} Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.185140 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerID="9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4" exitCode=0 Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.185201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerDied","Data":"9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4"} Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.192264 4773 generic.go:334] "Generic (PLEG): container finished" podID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerID="fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677" exitCode=0 Oct 12 20:27:18 crc kubenswrapper[4773]: I1012 20:27:18.193554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerDied","Data":"fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677"} Oct 12 20:27:18 crc kubenswrapper[4773]: E1012 20:27:18.195249 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5llck" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" Oct 12 20:27:18 crc kubenswrapper[4773]: E1012 20:27:18.196385 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rx9db" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" Oct 12 20:27:18 crc kubenswrapper[4773]: E1012 20:27:18.199615 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vgdhf" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 
20:27:19.200347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerStarted","Data":"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818"} Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.201765 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" event={"ID":"a0e0fa58-fcd9-4002-a975-a98fcba0f364","Type":"ContainerStarted","Data":"0998f6fa093b149c2380caddefe035b479df4f9b8928a439a64636ba3eccec25"} Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.201808 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6sbfz" event={"ID":"a0e0fa58-fcd9-4002-a975-a98fcba0f364","Type":"ContainerStarted","Data":"e5349c547d838a4ca7cf21adc360d828f57a4dea60971679efe19904926da16a"} Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.204569 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerStarted","Data":"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146"} Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.206911 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerStarted","Data":"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f"} Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.258210 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vqzr" podStartSLOduration=3.324720815 podStartE2EDuration="35.258181401s" podCreationTimestamp="2025-10-12 20:26:44 +0000 UTC" firstStartedPulling="2025-10-12 20:26:46.670554554 +0000 UTC m=+154.906853114" lastFinishedPulling="2025-10-12 
20:27:18.60401514 +0000 UTC m=+186.840313700" observedRunningTime="2025-10-12 20:27:19.234747545 +0000 UTC m=+187.471046115" watchObservedRunningTime="2025-10-12 20:27:19.258181401 +0000 UTC m=+187.494479961" Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.259551 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqrvr" podStartSLOduration=2.337730474 podStartE2EDuration="33.259544729s" podCreationTimestamp="2025-10-12 20:26:46 +0000 UTC" firstStartedPulling="2025-10-12 20:26:47.716072443 +0000 UTC m=+155.952371003" lastFinishedPulling="2025-10-12 20:27:18.637886698 +0000 UTC m=+186.874185258" observedRunningTime="2025-10-12 20:27:19.259476797 +0000 UTC m=+187.495775357" watchObservedRunningTime="2025-10-12 20:27:19.259544729 +0000 UTC m=+187.495843289" Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.294944 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqntd" podStartSLOduration=3.316726226 podStartE2EDuration="34.294907858s" podCreationTimestamp="2025-10-12 20:26:45 +0000 UTC" firstStartedPulling="2025-10-12 20:26:47.754885239 +0000 UTC m=+155.991183799" lastFinishedPulling="2025-10-12 20:27:18.733066881 +0000 UTC m=+186.969365431" observedRunningTime="2025-10-12 20:27:19.2788899 +0000 UTC m=+187.515188460" watchObservedRunningTime="2025-10-12 20:27:19.294907858 +0000 UTC m=+187.531206418" Oct 12 20:27:19 crc kubenswrapper[4773]: I1012 20:27:19.297422 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6sbfz" podStartSLOduration=166.297416109 podStartE2EDuration="2m46.297416109s" podCreationTimestamp="2025-10-12 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:27:19.29461892 +0000 UTC m=+187.530917480" 
watchObservedRunningTime="2025-10-12 20:27:19.297416109 +0000 UTC m=+187.533714669" Oct 12 20:27:20 crc kubenswrapper[4773]: I1012 20:27:20.612022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 20:27:24 crc kubenswrapper[4773]: I1012 20:27:24.564079 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:24 crc kubenswrapper[4773]: I1012 20:27:24.569122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:24 crc kubenswrapper[4773]: I1012 20:27:24.764781 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:25 crc kubenswrapper[4773]: I1012 20:27:25.281031 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:25 crc kubenswrapper[4773]: I1012 20:27:25.374143 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:27:26 crc kubenswrapper[4773]: I1012 20:27:26.258446 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:27:26 crc kubenswrapper[4773]: I1012 20:27:26.259657 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:27:26 crc kubenswrapper[4773]: I1012 20:27:26.296629 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:27:26 crc kubenswrapper[4773]: I1012 20:27:26.704862 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:26 crc 
kubenswrapper[4773]: I1012 20:27:26.704908 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:26 crc kubenswrapper[4773]: I1012 20:27:26.738763 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.253489 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vqzr" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="registry-server" containerID="cri-o://d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818" gracePeriod=2 Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.316131 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.323943 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.780371 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.782008 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.871981 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities\") pod \"730e4bc4-2374-41d6-8fd9-c870e1931f75\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.872152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content\") pod \"730e4bc4-2374-41d6-8fd9-c870e1931f75\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.872241 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crzf6\" (UniqueName: \"kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6\") pod \"730e4bc4-2374-41d6-8fd9-c870e1931f75\" (UID: \"730e4bc4-2374-41d6-8fd9-c870e1931f75\") " Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.873346 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities" (OuterVolumeSpecName: "utilities") pod "730e4bc4-2374-41d6-8fd9-c870e1931f75" (UID: "730e4bc4-2374-41d6-8fd9-c870e1931f75"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.873611 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.878693 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6" (OuterVolumeSpecName: "kube-api-access-crzf6") pod "730e4bc4-2374-41d6-8fd9-c870e1931f75" (UID: "730e4bc4-2374-41d6-8fd9-c870e1931f75"). InnerVolumeSpecName "kube-api-access-crzf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.920914 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "730e4bc4-2374-41d6-8fd9-c870e1931f75" (UID: "730e4bc4-2374-41d6-8fd9-c870e1931f75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.974941 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crzf6\" (UniqueName: \"kubernetes.io/projected/730e4bc4-2374-41d6-8fd9-c870e1931f75-kube-api-access-crzf6\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:27 crc kubenswrapper[4773]: I1012 20:27:27.974998 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730e4bc4-2374-41d6-8fd9-c870e1931f75-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.260441 4773 generic.go:334] "Generic (PLEG): container finished" podID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerID="d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818" exitCode=0 Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.260515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerDied","Data":"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818"} Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.260601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vqzr" event={"ID":"730e4bc4-2374-41d6-8fd9-c870e1931f75","Type":"ContainerDied","Data":"4f358fa97540dfb2b7922bdd2e05730abe90bf390efe9b56e8f979d66f232098"} Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.260700 4773 scope.go:117] "RemoveContainer" containerID="d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.261081 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vqzr" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.282026 4773 scope.go:117] "RemoveContainer" containerID="92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.292213 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.300077 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vqzr"] Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.315297 4773 scope.go:117] "RemoveContainer" containerID="b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.330629 4773 scope.go:117] "RemoveContainer" containerID="d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818" Oct 12 20:27:28 crc kubenswrapper[4773]: E1012 20:27:28.333462 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818\": container with ID starting with d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818 not found: ID does not exist" containerID="d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.333527 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818"} err="failed to get container status \"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818\": rpc error: code = NotFound desc = could not find container \"d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818\": container with ID starting with d43c40eefae98006a514ef029cd1d394be4bc2cc412f966ad314ac679eb0e818 not 
found: ID does not exist" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.333619 4773 scope.go:117] "RemoveContainer" containerID="92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b" Oct 12 20:27:28 crc kubenswrapper[4773]: E1012 20:27:28.334313 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b\": container with ID starting with 92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b not found: ID does not exist" containerID="92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.334392 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b"} err="failed to get container status \"92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b\": rpc error: code = NotFound desc = could not find container \"92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b\": container with ID starting with 92bd4f488de461b44de219fbd6699322ccda6ec86bff86d317eb0f151c96117b not found: ID does not exist" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.334456 4773 scope.go:117] "RemoveContainer" containerID="b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d" Oct 12 20:27:28 crc kubenswrapper[4773]: E1012 20:27:28.334899 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d\": container with ID starting with b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d not found: ID does not exist" containerID="b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.334932 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d"} err="failed to get container status \"b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d\": rpc error: code = NotFound desc = could not find container \"b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d\": container with ID starting with b9a3c166c5ae0b274252b2a43db04c2535e6a3f30ce40c76559df381fc0d789d not found: ID does not exist" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.506887 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" path="/var/lib/kubelet/pods/730e4bc4-2374-41d6-8fd9-c870e1931f75/volumes" Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.669510 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:27:28 crc kubenswrapper[4773]: I1012 20:27:28.670099 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.269665 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqrvr" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="registry-server" containerID="cri-o://6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f" gracePeriod=2 Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.708967 4773 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.797964 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content\") pod \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.798067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x8bh\" (UniqueName: \"kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh\") pod \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.803627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh" (OuterVolumeSpecName: "kube-api-access-6x8bh") pod "44ef61d4-c799-4aeb-9f05-4b5202a8abea" (UID: "44ef61d4-c799-4aeb-9f05-4b5202a8abea"). InnerVolumeSpecName "kube-api-access-6x8bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.818924 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44ef61d4-c799-4aeb-9f05-4b5202a8abea" (UID: "44ef61d4-c799-4aeb-9f05-4b5202a8abea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.899101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities\") pod \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\" (UID: \"44ef61d4-c799-4aeb-9f05-4b5202a8abea\") " Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.899469 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.899486 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x8bh\" (UniqueName: \"kubernetes.io/projected/44ef61d4-c799-4aeb-9f05-4b5202a8abea-kube-api-access-6x8bh\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:29 crc kubenswrapper[4773]: I1012 20:27:29.901003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities" (OuterVolumeSpecName: "utilities") pod "44ef61d4-c799-4aeb-9f05-4b5202a8abea" (UID: "44ef61d4-c799-4aeb-9f05-4b5202a8abea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.001153 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44ef61d4-c799-4aeb-9f05-4b5202a8abea-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.276125 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5b41529-16fa-43a7-a245-34ca5e013832" containerID="e1e4f602c4e10b92a7928f99936673eb795dd71e4946217238eec9aacebc4b49" exitCode=0 Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.276215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerDied","Data":"e1e4f602c4e10b92a7928f99936673eb795dd71e4946217238eec9aacebc4b49"} Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.283442 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerStarted","Data":"ca7750d7ab691eb9da9ae0aa60893535e25affb133036be7af114c53654c031f"} Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.285606 4773 generic.go:334] "Generic (PLEG): container finished" podID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerID="6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f" exitCode=0 Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.285648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerDied","Data":"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f"} Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.285670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqrvr" 
event={"ID":"44ef61d4-c799-4aeb-9f05-4b5202a8abea","Type":"ContainerDied","Data":"028f98770c9c015922d8a5d3660d8162597399050116ca0a230baee3876d616c"} Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.285692 4773 scope.go:117] "RemoveContainer" containerID="6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.285754 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqrvr" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.327148 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.329213 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqrvr"] Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.336666 4773 scope.go:117] "RemoveContainer" containerID="fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.356161 4773 scope.go:117] "RemoveContainer" containerID="d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.369990 4773 scope.go:117] "RemoveContainer" containerID="6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f" Oct 12 20:27:30 crc kubenswrapper[4773]: E1012 20:27:30.370437 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f\": container with ID starting with 6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f not found: ID does not exist" containerID="6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.370564 4773 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f"} err="failed to get container status \"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f\": rpc error: code = NotFound desc = could not find container \"6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f\": container with ID starting with 6b38e1f8554ac8158b2c4b20ecd182cc5d57e5e1d3b858a35bd2d82723807c0f not found: ID does not exist" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.370673 4773 scope.go:117] "RemoveContainer" containerID="fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677" Oct 12 20:27:30 crc kubenswrapper[4773]: E1012 20:27:30.371135 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677\": container with ID starting with fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677 not found: ID does not exist" containerID="fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.371177 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677"} err="failed to get container status \"fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677\": rpc error: code = NotFound desc = could not find container \"fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677\": container with ID starting with fd3737e3d105fb6c12bed57e8346f007a6b0e3f47b8480ec4c127271d0b66677 not found: ID does not exist" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.371215 4773 scope.go:117] "RemoveContainer" containerID="d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb" Oct 12 20:27:30 crc kubenswrapper[4773]: E1012 20:27:30.371463 4773 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb\": container with ID starting with d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb not found: ID does not exist" containerID="d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.371497 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb"} err="failed to get container status \"d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb\": rpc error: code = NotFound desc = could not find container \"d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb\": container with ID starting with d6cef43c5c22764e729367f351b1da592bc3150eb221357076bdc7ab4ba69eeb not found: ID does not exist" Oct 12 20:27:30 crc kubenswrapper[4773]: I1012 20:27:30.490370 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" path="/var/lib/kubelet/pods/44ef61d4-c799-4aeb-9f05-4b5202a8abea/volumes" Oct 12 20:27:31 crc kubenswrapper[4773]: I1012 20:27:31.291892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerStarted","Data":"e24e3bee0b0104b29862a08fa1b194dcad514dd1f4cbdb378dfb82543927ac84"} Oct 12 20:27:31 crc kubenswrapper[4773]: I1012 20:27:31.293586 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerStarted","Data":"0c401204b88942c2bd5e4bf95cb130807d9b43a34745f4c4826b28cdb59d3a9d"} Oct 12 20:27:31 crc kubenswrapper[4773]: I1012 20:27:31.294922 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="468fec97-d137-481e-a61b-9e385a5165a5" containerID="ca7750d7ab691eb9da9ae0aa60893535e25affb133036be7af114c53654c031f" exitCode=0 Oct 12 20:27:31 crc kubenswrapper[4773]: I1012 20:27:31.294995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerDied","Data":"ca7750d7ab691eb9da9ae0aa60893535e25affb133036be7af114c53654c031f"} Oct 12 20:27:31 crc kubenswrapper[4773]: I1012 20:27:31.353005 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x5t25" podStartSLOduration=3.168595638 podStartE2EDuration="48.352990366s" podCreationTimestamp="2025-10-12 20:26:43 +0000 UTC" firstStartedPulling="2025-10-12 20:26:45.59624605 +0000 UTC m=+153.832551520" lastFinishedPulling="2025-10-12 20:27:30.780647658 +0000 UTC m=+199.016946248" observedRunningTime="2025-10-12 20:27:31.350010832 +0000 UTC m=+199.586309392" watchObservedRunningTime="2025-10-12 20:27:31.352990366 +0000 UTC m=+199.589288926" Oct 12 20:27:32 crc kubenswrapper[4773]: I1012 20:27:32.318647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerStarted","Data":"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c"} Oct 12 20:27:32 crc kubenswrapper[4773]: I1012 20:27:32.324933 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerStarted","Data":"6f4129154d9fa43649361342d13df14782d2b58ec4bd18e0fddfdd210c3c5a30"} Oct 12 20:27:32 crc kubenswrapper[4773]: I1012 20:27:32.329927 4773 generic.go:334] "Generic (PLEG): container finished" podID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerID="e24e3bee0b0104b29862a08fa1b194dcad514dd1f4cbdb378dfb82543927ac84" exitCode=0 Oct 12 
20:27:32 crc kubenswrapper[4773]: I1012 20:27:32.330080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerDied","Data":"e24e3bee0b0104b29862a08fa1b194dcad514dd1f4cbdb378dfb82543927ac84"} Oct 12 20:27:32 crc kubenswrapper[4773]: I1012 20:27:32.413795 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tw42c" podStartSLOduration=3.346540668 podStartE2EDuration="48.413779238s" podCreationTimestamp="2025-10-12 20:26:44 +0000 UTC" firstStartedPulling="2025-10-12 20:26:46.649079683 +0000 UTC m=+154.885378243" lastFinishedPulling="2025-10-12 20:27:31.716318253 +0000 UTC m=+199.952616813" observedRunningTime="2025-10-12 20:27:32.41105126 +0000 UTC m=+200.647349820" watchObservedRunningTime="2025-10-12 20:27:32.413779238 +0000 UTC m=+200.650077798" Oct 12 20:27:33 crc kubenswrapper[4773]: I1012 20:27:33.340291 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerStarted","Data":"b4adc26dd0d2a5c2a5aa920312024c1e5c8a8d02b483bf04f288263d89bb7cde"} Oct 12 20:27:33 crc kubenswrapper[4773]: I1012 20:27:33.341805 4773 generic.go:334] "Generic (PLEG): container finished" podID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerID="ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c" exitCode=0 Oct 12 20:27:33 crc kubenswrapper[4773]: I1012 20:27:33.341833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerDied","Data":"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c"} Oct 12 20:27:33 crc kubenswrapper[4773]: I1012 20:27:33.344699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerStarted","Data":"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656"} Oct 12 20:27:33 crc kubenswrapper[4773]: I1012 20:27:33.374420 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rx9db" podStartSLOduration=4.200844627 podStartE2EDuration="50.374402693s" podCreationTimestamp="2025-10-12 20:26:43 +0000 UTC" firstStartedPulling="2025-10-12 20:26:46.670872643 +0000 UTC m=+154.907171203" lastFinishedPulling="2025-10-12 20:27:32.844430719 +0000 UTC m=+201.080729269" observedRunningTime="2025-10-12 20:27:33.371330556 +0000 UTC m=+201.607629116" watchObservedRunningTime="2025-10-12 20:27:33.374402693 +0000 UTC m=+201.610701253" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.175251 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.175579 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.224017 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.352772 4773 generic.go:334] "Generic (PLEG): container finished" podID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerID="c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656" exitCode=0 Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.352877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerDied","Data":"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656"} Oct 12 20:27:34 crc 
kubenswrapper[4773]: I1012 20:27:34.585584 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.585671 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.713878 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.714027 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:34 crc kubenswrapper[4773]: I1012 20:27:34.764373 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:35 crc kubenswrapper[4773]: I1012 20:27:35.645841 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rx9db" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="registry-server" probeResult="failure" output=< Oct 12 20:27:35 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 20:27:35 crc kubenswrapper[4773]: > Oct 12 20:27:36 crc kubenswrapper[4773]: I1012 20:27:36.422785 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:37 crc kubenswrapper[4773]: I1012 20:27:37.374584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerStarted","Data":"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9"} Oct 12 20:27:37 crc kubenswrapper[4773]: I1012 20:27:37.398638 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-vgdhf" podStartSLOduration=3.8678453680000002 podStartE2EDuration="51.398611397s" podCreationTimestamp="2025-10-12 20:26:46 +0000 UTC" firstStartedPulling="2025-10-12 20:26:48.799064799 +0000 UTC m=+157.035363359" lastFinishedPulling="2025-10-12 20:27:36.329830818 +0000 UTC m=+204.566129388" observedRunningTime="2025-10-12 20:27:37.393951875 +0000 UTC m=+205.630250435" watchObservedRunningTime="2025-10-12 20:27:37.398611397 +0000 UTC m=+205.634909967" Oct 12 20:27:38 crc kubenswrapper[4773]: I1012 20:27:38.374630 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:27:38 crc kubenswrapper[4773]: I1012 20:27:38.389099 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tw42c" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="registry-server" containerID="cri-o://6f4129154d9fa43649361342d13df14782d2b58ec4bd18e0fddfdd210c3c5a30" gracePeriod=2 Oct 12 20:27:39 crc kubenswrapper[4773]: I1012 20:27:39.397469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerStarted","Data":"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362"} Oct 12 20:27:39 crc kubenswrapper[4773]: I1012 20:27:39.419900 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5llck" podStartSLOduration=2.949828149 podStartE2EDuration="52.419877419s" podCreationTimestamp="2025-10-12 20:26:47 +0000 UTC" firstStartedPulling="2025-10-12 20:26:48.796828327 +0000 UTC m=+157.033126887" lastFinishedPulling="2025-10-12 20:27:38.266877597 +0000 UTC m=+206.503176157" observedRunningTime="2025-10-12 20:27:39.418521761 +0000 UTC m=+207.654820321" watchObservedRunningTime="2025-10-12 20:27:39.419877419 +0000 UTC m=+207.656175979" Oct 
12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.404083 4773 generic.go:334] "Generic (PLEG): container finished" podID="468fec97-d137-481e-a61b-9e385a5165a5" containerID="6f4129154d9fa43649361342d13df14782d2b58ec4bd18e0fddfdd210c3c5a30" exitCode=0 Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.404140 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerDied","Data":"6f4129154d9fa43649361342d13df14782d2b58ec4bd18e0fddfdd210c3c5a30"} Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.717482 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.857727 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content\") pod \"468fec97-d137-481e-a61b-9e385a5165a5\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.857803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities\") pod \"468fec97-d137-481e-a61b-9e385a5165a5\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.857840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bxt\" (UniqueName: \"kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt\") pod \"468fec97-d137-481e-a61b-9e385a5165a5\" (UID: \"468fec97-d137-481e-a61b-9e385a5165a5\") " Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.858834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities" (OuterVolumeSpecName: "utilities") pod "468fec97-d137-481e-a61b-9e385a5165a5" (UID: "468fec97-d137-481e-a61b-9e385a5165a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.880112 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt" (OuterVolumeSpecName: "kube-api-access-49bxt") pod "468fec97-d137-481e-a61b-9e385a5165a5" (UID: "468fec97-d137-481e-a61b-9e385a5165a5"). InnerVolumeSpecName "kube-api-access-49bxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.920193 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "468fec97-d137-481e-a61b-9e385a5165a5" (UID: "468fec97-d137-481e-a61b-9e385a5165a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.959837 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.959885 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/468fec97-d137-481e-a61b-9e385a5165a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:40 crc kubenswrapper[4773]: I1012 20:27:40.959898 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bxt\" (UniqueName: \"kubernetes.io/projected/468fec97-d137-481e-a61b-9e385a5165a5-kube-api-access-49bxt\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.410792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw42c" event={"ID":"468fec97-d137-481e-a61b-9e385a5165a5","Type":"ContainerDied","Data":"8277cab2a02a61996df7a2e7d99459acd51616cd7c34e040cce72fb78dc8b8fa"} Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.410863 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tw42c" Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.410867 4773 scope.go:117] "RemoveContainer" containerID="6f4129154d9fa43649361342d13df14782d2b58ec4bd18e0fddfdd210c3c5a30" Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.428844 4773 scope.go:117] "RemoveContainer" containerID="ca7750d7ab691eb9da9ae0aa60893535e25affb133036be7af114c53654c031f" Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.449423 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.457011 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tw42c"] Oct 12 20:27:41 crc kubenswrapper[4773]: I1012 20:27:41.460804 4773 scope.go:117] "RemoveContainer" containerID="023e0e5ab67a0a244d99e27e178eb829e6031e0f350f7db27752e41a2130072c" Oct 12 20:27:42 crc kubenswrapper[4773]: I1012 20:27:42.486630 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468fec97-d137-481e-a61b-9e385a5165a5" path="/var/lib/kubelet/pods/468fec97-d137-481e-a61b-9e385a5165a5/volumes" Oct 12 20:27:44 crc kubenswrapper[4773]: I1012 20:27:44.212375 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:27:44 crc kubenswrapper[4773]: I1012 20:27:44.625827 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:27:44 crc kubenswrapper[4773]: I1012 20:27:44.663501 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.308976 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:27:47 crc 
kubenswrapper[4773]: I1012 20:27:47.309631 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.350124 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.486693 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.760559 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.761219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:47 crc kubenswrapper[4773]: I1012 20:27:47.930025 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:48 crc kubenswrapper[4773]: I1012 20:27:48.489596 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:49 crc kubenswrapper[4773]: I1012 20:27:49.379634 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:27:50 crc kubenswrapper[4773]: I1012 20:27:50.458685 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5llck" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="registry-server" containerID="cri-o://c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362" gracePeriod=2 Oct 12 20:27:50 crc kubenswrapper[4773]: I1012 20:27:50.865408 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.003134 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content\") pod \"da05a3bf-3763-4f69-9392-9ee4204a97c1\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.003302 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities\") pod \"da05a3bf-3763-4f69-9392-9ee4204a97c1\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.003424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqls\" (UniqueName: \"kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls\") pod \"da05a3bf-3763-4f69-9392-9ee4204a97c1\" (UID: \"da05a3bf-3763-4f69-9392-9ee4204a97c1\") " Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.004443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities" (OuterVolumeSpecName: "utilities") pod "da05a3bf-3763-4f69-9392-9ee4204a97c1" (UID: "da05a3bf-3763-4f69-9392-9ee4204a97c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.023064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls" (OuterVolumeSpecName: "kube-api-access-kxqls") pod "da05a3bf-3763-4f69-9392-9ee4204a97c1" (UID: "da05a3bf-3763-4f69-9392-9ee4204a97c1"). InnerVolumeSpecName "kube-api-access-kxqls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.105629 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqls\" (UniqueName: \"kubernetes.io/projected/da05a3bf-3763-4f69-9392-9ee4204a97c1-kube-api-access-kxqls\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.106133 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.110358 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da05a3bf-3763-4f69-9392-9ee4204a97c1" (UID: "da05a3bf-3763-4f69-9392-9ee4204a97c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.207451 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da05a3bf-3763-4f69-9392-9ee4204a97c1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.468026 4773 generic.go:334] "Generic (PLEG): container finished" podID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerID="c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362" exitCode=0 Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.468084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerDied","Data":"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362"} Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.468174 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5llck" event={"ID":"da05a3bf-3763-4f69-9392-9ee4204a97c1","Type":"ContainerDied","Data":"25e06574b7e1f33a27d12e5d92ac6ac43fbef422c58c40c04c8cc5ecf6dda8f8"} Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.468212 4773 scope.go:117] "RemoveContainer" containerID="c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.468211 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5llck" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.491636 4773 scope.go:117] "RemoveContainer" containerID="c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.508618 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.510777 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5llck"] Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.516360 4773 scope.go:117] "RemoveContainer" containerID="21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.544036 4773 scope.go:117] "RemoveContainer" containerID="c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362" Oct 12 20:27:51 crc kubenswrapper[4773]: E1012 20:27:51.544767 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362\": container with ID starting with c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362 not found: ID does not exist" containerID="c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.544808 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362"} err="failed to get container status \"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362\": rpc error: code = NotFound desc = could not find container \"c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362\": container with ID starting with c11afa33378b9b75b6875db69e13b34e6f5254f68b2da458a962a3f401149362 not found: ID does not exist" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.544837 4773 scope.go:117] "RemoveContainer" containerID="c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656" Oct 12 20:27:51 crc kubenswrapper[4773]: E1012 20:27:51.545368 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656\": container with ID starting with c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656 not found: ID does not exist" containerID="c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.545399 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656"} err="failed to get container status \"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656\": rpc error: code = NotFound desc = could not find container \"c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656\": container with ID starting with c7814cab67640220544df32e371b6c64ff5455b256e9b5bb780b5aedda686656 not found: ID does not exist" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.545415 4773 scope.go:117] "RemoveContainer" containerID="21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a" Oct 12 20:27:51 crc kubenswrapper[4773]: E1012 
20:27:51.545830 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a\": container with ID starting with 21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a not found: ID does not exist" containerID="21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a" Oct 12 20:27:51 crc kubenswrapper[4773]: I1012 20:27:51.545857 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a"} err="failed to get container status \"21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a\": rpc error: code = NotFound desc = could not find container \"21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a\": container with ID starting with 21a1e788bc9e289cf5b69da0c41c3ac0d189d1aaa34f311f06ddd3aa92f73b2a not found: ID does not exist" Oct 12 20:27:52 crc kubenswrapper[4773]: I1012 20:27:52.487405 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" path="/var/lib/kubelet/pods/da05a3bf-3763-4f69-9392-9ee4204a97c1/volumes" Oct 12 20:27:54 crc kubenswrapper[4773]: I1012 20:27:54.379430 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:27:58 crc kubenswrapper[4773]: I1012 20:27:58.669098 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:27:58 crc kubenswrapper[4773]: I1012 20:27:58.669431 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:27:58 crc kubenswrapper[4773]: I1012 20:27:58.669487 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:27:58 crc kubenswrapper[4773]: I1012 20:27:58.670177 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:27:58 crc kubenswrapper[4773]: I1012 20:27:58.670233 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c" gracePeriod=600 Oct 12 20:27:59 crc kubenswrapper[4773]: I1012 20:27:59.512364 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c" exitCode=0 Oct 12 20:27:59 crc kubenswrapper[4773]: I1012 20:27:59.512430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c"} Oct 12 20:27:59 crc kubenswrapper[4773]: I1012 20:27:59.512757 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66"} Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.417561 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" containerName="oauth-openshift" containerID="cri-o://a8b2f60fa9881667f103897d81382e26064f28a4908e88f25955fcabdc0aeb3a" gracePeriod=15 Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.648090 4773 generic.go:334] "Generic (PLEG): container finished" podID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" containerID="a8b2f60fa9881667f103897d81382e26064f28a4908e88f25955fcabdc0aeb3a" exitCode=0 Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.648157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" event={"ID":"7be19f86-17ad-4697-9ff0-d5b7ee06a60d","Type":"ContainerDied","Data":"a8b2f60fa9881667f103897d81382e26064f28a4908e88f25955fcabdc0aeb3a"} Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.888291 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927542 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-pcd9z"] Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927836 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927851 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927861 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d07389-38be-4bdf-960b-98e9e2ce8eb4" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927866 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d07389-38be-4bdf-960b-98e9e2ce8eb4" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927876 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927883 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927891 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82c578a-7570-486c-a9cf-e22e7ed52e9f" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927912 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82c578a-7570-486c-a9cf-e22e7ed52e9f" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927921 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" 
containerName="oauth-openshift" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927927 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" containerName="oauth-openshift" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927936 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927942 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.927951 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.927969 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930871 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930887 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930900 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930907 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930918 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" 
containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930941 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930951 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5565c983-8814-411e-b913-0ea8e4d73c0f" containerName="collect-profiles" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930957 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5565c983-8814-411e-b913-0ea8e4d73c0f" containerName="collect-profiles" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930967 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930973 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930981 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.930988 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.930997 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931003 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.931025 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" 
containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931031 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="extract-content" Oct 12 20:28:19 crc kubenswrapper[4773]: E1012 20:28:19.931040 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931047 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="extract-utilities" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931231 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ef61d4-c799-4aeb-9f05-4b5202a8abea" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931244 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05a3bf-3763-4f69-9392-9ee4204a97c1" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931272 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="468fec97-d137-481e-a61b-9e385a5165a5" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931283 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" containerName="oauth-openshift" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931294 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82c578a-7570-486c-a9cf-e22e7ed52e9f" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931303 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="730e4bc4-2374-41d6-8fd9-c870e1931f75" containerName="registry-server" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931312 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d0d07389-38be-4bdf-960b-98e9e2ce8eb4" containerName="pruner" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931321 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5565c983-8814-411e-b913-0ea8e4d73c0f" containerName="collect-profiles" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.931866 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.950987 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-pcd9z"] Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982407 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982468 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982564 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982712 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.982798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf5gr\" (UniqueName: 
\"kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983575 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983610 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983759 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983857 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.983924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error\") pod \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\" (UID: \"7be19f86-17ad-4697-9ff0-d5b7ee06a60d\") " Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984134 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-policies\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984262 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984287 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984290 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-dir\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984379 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: 
\"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984512 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85r4c\" (UniqueName: \"kubernetes.io/projected/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-kube-api-access-85r4c\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: 
I1012 20:28:19.984601 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984654 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984752 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984769 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984783 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.984910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir" 
(OuterVolumeSpecName: "audit-dir") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.988951 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr" (OuterVolumeSpecName: "kube-api-access-mf5gr") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "kube-api-access-mf5gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.989183 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.989772 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.996897 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.997110 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.997363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.997649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.997672 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.998038 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:19 crc kubenswrapper[4773]: I1012 20:28:19.998152 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7be19f86-17ad-4697-9ff0-d5b7ee06a60d" (UID: "7be19f86-17ad-4697-9ff0-d5b7ee06a60d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086612 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086773 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-policies\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086865 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086904 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-dir\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086940 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.086971 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc 
kubenswrapper[4773]: I1012 20:28:20.087071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85r4c\" (UniqueName: \"kubernetes.io/projected/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-kube-api-access-85r4c\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087308 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087330 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087351 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087373 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087395 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087414 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087433 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087454 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087474 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf5gr\" (UniqueName: \"kubernetes.io/projected/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-kube-api-access-mf5gr\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087494 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087514 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7be19f86-17ad-4697-9ff0-d5b7ee06a60d-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087691 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " 
pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-policies\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.087803 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-audit-dir\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.088403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.089372 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.091940 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.092338 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.093183 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.093544 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.093991 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 
12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.094365 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.095276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.095522 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.104740 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85r4c\" (UniqueName: \"kubernetes.io/projected/212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3-kube-api-access-85r4c\") pod \"oauth-openshift-9565f95f5-pcd9z\" (UID: \"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.259791 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.657920 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" event={"ID":"7be19f86-17ad-4697-9ff0-d5b7ee06a60d","Type":"ContainerDied","Data":"c74e8c5b41f1311eb7b129f18622e527830f9cea4aae009aeb53010a5dd12311"} Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.658863 4773 scope.go:117] "RemoveContainer" containerID="a8b2f60fa9881667f103897d81382e26064f28a4908e88f25955fcabdc0aeb3a" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.658103 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7jq2n" Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.693540 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.696236 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7jq2n"] Oct 12 20:28:20 crc kubenswrapper[4773]: I1012 20:28:20.713493 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-pcd9z"] Oct 12 20:28:21 crc kubenswrapper[4773]: I1012 20:28:21.670758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" event={"ID":"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3","Type":"ContainerStarted","Data":"248b366054c28211af366515b4cc54d92617ae6af9ff4957311136f1bbdac5ab"} Oct 12 20:28:21 crc kubenswrapper[4773]: I1012 20:28:21.671245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" 
event={"ID":"212dad9b-d48b-4e8d-b5fe-ff20ad6de7b3","Type":"ContainerStarted","Data":"8e574b15a0febcc3234ebea064b180c65da35c5c3fd235dbcad1bd14b9a5af45"} Oct 12 20:28:21 crc kubenswrapper[4773]: I1012 20:28:21.671303 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:21 crc kubenswrapper[4773]: I1012 20:28:21.678333 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" Oct 12 20:28:21 crc kubenswrapper[4773]: I1012 20:28:21.707402 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9565f95f5-pcd9z" podStartSLOduration=27.707374444 podStartE2EDuration="27.707374444s" podCreationTimestamp="2025-10-12 20:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:28:21.702698651 +0000 UTC m=+249.938997251" watchObservedRunningTime="2025-10-12 20:28:21.707374444 +0000 UTC m=+249.943673034" Oct 12 20:28:22 crc kubenswrapper[4773]: I1012 20:28:22.496036 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be19f86-17ad-4697-9ff0-d5b7ee06a60d" path="/var/lib/kubelet/pods/7be19f86-17ad-4697-9ff0-d5b7ee06a60d/volumes" Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.697297 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.698492 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x5t25" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="registry-server" containerID="cri-o://0c401204b88942c2bd5e4bf95cb130807d9b43a34745f4c4826b28cdb59d3a9d" gracePeriod=30 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.712610 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.713244 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rx9db" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="registry-server" containerID="cri-o://b4adc26dd0d2a5c2a5aa920312024c1e5c8a8d02b483bf04f288263d89bb7cde" gracePeriod=30 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.726640 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.727089 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" containerID="cri-o://7375183bc84201c2724c425a6ab273ea033e7339c26d84be204445d263da6728" gracePeriod=30 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.740906 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.741482 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqntd" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="registry-server" containerID="cri-o://77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146" gracePeriod=30 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.762945 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2v2pc"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.763995 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.765466 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.765746 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vgdhf" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="registry-server" containerID="cri-o://dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9" gracePeriod=30 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.781934 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2v2pc"] Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.863939 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5b41529-16fa-43a7-a245-34ca5e013832" containerID="0c401204b88942c2bd5e4bf95cb130807d9b43a34745f4c4826b28cdb59d3a9d" exitCode=0 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.864194 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerDied","Data":"0c401204b88942c2bd5e4bf95cb130807d9b43a34745f4c4826b28cdb59d3a9d"} Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.878703 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerID="7375183bc84201c2724c425a6ab273ea033e7339c26d84be204445d263da6728" exitCode=0 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.878882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" event={"ID":"cb6140cb-0bb4-4fe5-bc14-85ac2e640334","Type":"ContainerDied","Data":"7375183bc84201c2724c425a6ab273ea033e7339c26d84be204445d263da6728"} Oct 12 20:28:47 crc 
kubenswrapper[4773]: I1012 20:28:47.900731 4773 generic.go:334] "Generic (PLEG): container finished" podID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerID="b4adc26dd0d2a5c2a5aa920312024c1e5c8a8d02b483bf04f288263d89bb7cde" exitCode=0 Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.900797 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerDied","Data":"b4adc26dd0d2a5c2a5aa920312024c1e5c8a8d02b483bf04f288263d89bb7cde"} Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.945499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.945566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:47 crc kubenswrapper[4773]: I1012 20:28:47.945592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pgv6\" (UniqueName: \"kubernetes.io/projected/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-kube-api-access-6pgv6\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.047010 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.047078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pgv6\" (UniqueName: \"kubernetes.io/projected/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-kube-api-access-6pgv6\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.047147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.048845 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.055078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: 
\"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.075132 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pgv6\" (UniqueName: \"kubernetes.io/projected/3f993aa7-e2c9-41bb-96ba-0b4e0682c92a-kube-api-access-6pgv6\") pod \"marketplace-operator-79b997595-2v2pc\" (UID: \"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a\") " pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.088480 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.149473 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.267265 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.292972 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.303698 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.327088 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.350411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content\") pod \"b5b41529-16fa-43a7-a245-34ca5e013832\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.350560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfwhf\" (UniqueName: \"kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf\") pod \"b5b41529-16fa-43a7-a245-34ca5e013832\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.350810 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities\") pod \"b5b41529-16fa-43a7-a245-34ca5e013832\" (UID: \"b5b41529-16fa-43a7-a245-34ca5e013832\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.352218 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities" (OuterVolumeSpecName: "utilities") pod "b5b41529-16fa-43a7-a245-34ca5e013832" (UID: "b5b41529-16fa-43a7-a245-34ca5e013832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.361965 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf" (OuterVolumeSpecName: "kube-api-access-jfwhf") pod "b5b41529-16fa-43a7-a245-34ca5e013832" (UID: "b5b41529-16fa-43a7-a245-34ca5e013832"). InnerVolumeSpecName "kube-api-access-jfwhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm2bd\" (UniqueName: \"kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd\") pod \"6b6ebc1e-7018-4d16-a56b-a962f165de10\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452436 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p95xv\" (UniqueName: \"kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv\") pod \"4d8bb41c-c5fb-48de-b561-8b2473147603\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452463 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content\") pod \"6b6ebc1e-7018-4d16-a56b-a962f165de10\" (UID: \"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities\") pod \"0da0fc67-a78b-402d-ba7a-24209fea4258\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452528 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfhhr\" (UniqueName: \"kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr\") pod \"0da0fc67-a78b-402d-ba7a-24209fea4258\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452573 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content\") pod \"0da0fc67-a78b-402d-ba7a-24209fea4258\" (UID: \"0da0fc67-a78b-402d-ba7a-24209fea4258\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content\") pod \"4d8bb41c-c5fb-48de-b561-8b2473147603\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452634 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca\") pod \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities\") pod \"4d8bb41c-c5fb-48de-b561-8b2473147603\" (UID: \"4d8bb41c-c5fb-48de-b561-8b2473147603\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452699 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics\") pod \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities\") pod \"6b6ebc1e-7018-4d16-a56b-a962f165de10\" (UID: 
\"6b6ebc1e-7018-4d16-a56b-a962f165de10\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452767 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsw7q\" (UniqueName: \"kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q\") pod \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\" (UID: \"cb6140cb-0bb4-4fe5-bc14-85ac2e640334\") " Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452982 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfwhf\" (UniqueName: \"kubernetes.io/projected/b5b41529-16fa-43a7-a245-34ca5e013832-kube-api-access-jfwhf\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.452994 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.455412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities" (OuterVolumeSpecName: "utilities") pod "6b6ebc1e-7018-4d16-a56b-a962f165de10" (UID: "6b6ebc1e-7018-4d16-a56b-a962f165de10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.455515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities" (OuterVolumeSpecName: "utilities") pod "0da0fc67-a78b-402d-ba7a-24209fea4258" (UID: "0da0fc67-a78b-402d-ba7a-24209fea4258"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.458802 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities" (OuterVolumeSpecName: "utilities") pod "4d8bb41c-c5fb-48de-b561-8b2473147603" (UID: "4d8bb41c-c5fb-48de-b561-8b2473147603"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.458619 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cb6140cb-0bb4-4fe5-bc14-85ac2e640334" (UID: "cb6140cb-0bb4-4fe5-bc14-85ac2e640334"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.462333 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5b41529-16fa-43a7-a245-34ca5e013832" (UID: "b5b41529-16fa-43a7-a245-34ca5e013832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.464860 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q" (OuterVolumeSpecName: "kube-api-access-qsw7q") pod "cb6140cb-0bb4-4fe5-bc14-85ac2e640334" (UID: "cb6140cb-0bb4-4fe5-bc14-85ac2e640334"). InnerVolumeSpecName "kube-api-access-qsw7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.465506 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd" (OuterVolumeSpecName: "kube-api-access-wm2bd") pod "6b6ebc1e-7018-4d16-a56b-a962f165de10" (UID: "6b6ebc1e-7018-4d16-a56b-a962f165de10"). InnerVolumeSpecName "kube-api-access-wm2bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.465645 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr" (OuterVolumeSpecName: "kube-api-access-sfhhr") pod "0da0fc67-a78b-402d-ba7a-24209fea4258" (UID: "0da0fc67-a78b-402d-ba7a-24209fea4258"). InnerVolumeSpecName "kube-api-access-sfhhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.466002 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv" (OuterVolumeSpecName: "kube-api-access-p95xv") pod "4d8bb41c-c5fb-48de-b561-8b2473147603" (UID: "4d8bb41c-c5fb-48de-b561-8b2473147603"). InnerVolumeSpecName "kube-api-access-p95xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.474404 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cb6140cb-0bb4-4fe5-bc14-85ac2e640334" (UID: "cb6140cb-0bb4-4fe5-bc14-85ac2e640334"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.495986 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d8bb41c-c5fb-48de-b561-8b2473147603" (UID: "4d8bb41c-c5fb-48de-b561-8b2473147603"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.530261 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b6ebc1e-7018-4d16-a56b-a962f165de10" (UID: "6b6ebc1e-7018-4d16-a56b-a962f165de10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.545944 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0da0fc67-a78b-402d-ba7a-24209fea4258" (UID: "0da0fc67-a78b-402d-ba7a-24209fea4258"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554764 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554790 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554809 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554818 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsw7q\" (UniqueName: \"kubernetes.io/projected/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-kube-api-access-qsw7q\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554827 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm2bd\" (UniqueName: \"kubernetes.io/projected/6b6ebc1e-7018-4d16-a56b-a962f165de10-kube-api-access-wm2bd\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554836 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p95xv\" (UniqueName: \"kubernetes.io/projected/4d8bb41c-c5fb-48de-b561-8b2473147603-kube-api-access-p95xv\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554844 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6ebc1e-7018-4d16-a56b-a962f165de10-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 
crc kubenswrapper[4773]: I1012 20:28:48.554854 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554863 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfhhr\" (UniqueName: \"kubernetes.io/projected/0da0fc67-a78b-402d-ba7a-24209fea4258-kube-api-access-sfhhr\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554873 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b41529-16fa-43a7-a245-34ca5e013832-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554882 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0fc67-a78b-402d-ba7a-24209fea4258-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554890 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8bb41c-c5fb-48de-b561-8b2473147603-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.554898 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6140cb-0bb4-4fe5-bc14-85ac2e640334-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.659297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2v2pc"] Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.923702 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rx9db" 
event={"ID":"6b6ebc1e-7018-4d16-a56b-a962f165de10","Type":"ContainerDied","Data":"b60a0a4383009859030f1599d018fcd4e41d6d5acae2a16d089e5dabd17f77a1"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.923767 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rx9db" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.923774 4773 scope.go:117] "RemoveContainer" containerID="b4adc26dd0d2a5c2a5aa920312024c1e5c8a8d02b483bf04f288263d89bb7cde" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.929347 4773 generic.go:334] "Generic (PLEG): container finished" podID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerID="dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9" exitCode=0 Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.929391 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vgdhf" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.929430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerDied","Data":"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.929459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vgdhf" event={"ID":"0da0fc67-a78b-402d-ba7a-24209fea4258","Type":"ContainerDied","Data":"3db308981d9df8d391dc87ff956c09b894ab615eb1894721268d3098d1d38ee9"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.932644 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" event={"ID":"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a","Type":"ContainerStarted","Data":"1c04a38f493722479a8a94a59d03d6bfdbacdb20d6bcfd7bfb32d12524511889"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 
20:28:48.932747 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" event={"ID":"3f993aa7-e2c9-41bb-96ba-0b4e0682c92a","Type":"ContainerStarted","Data":"74ddb322b8ba25785e016bce152217dad38bea5d0df404c12a01457d9ef5ba19"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.934004 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.941798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5t25" event={"ID":"b5b41529-16fa-43a7-a245-34ca5e013832","Type":"ContainerDied","Data":"7639d4ffe4ec13e5174aabe0dce14cd661d43c0d89fe9007cd8aeabdbe237440"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.941896 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5t25" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.942513 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2v2pc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.942564 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" podUID="3f993aa7-e2c9-41bb-96ba-0b4e0682c92a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.945223 4773 scope.go:117] "RemoveContainer" containerID="e24e3bee0b0104b29862a08fa1b194dcad514dd1f4cbdb378dfb82543927ac84" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.959427 
4773 generic.go:334] "Generic (PLEG): container finished" podID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerID="77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146" exitCode=0 Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.959987 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerDied","Data":"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.959941 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" podStartSLOduration=1.959922264 podStartE2EDuration="1.959922264s" podCreationTimestamp="2025-10-12 20:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:28:48.947825891 +0000 UTC m=+277.184124451" watchObservedRunningTime="2025-10-12 20:28:48.959922264 +0000 UTC m=+277.196220824" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.960020 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqntd" event={"ID":"4d8bb41c-c5fb-48de-b561-8b2473147603","Type":"ContainerDied","Data":"b29d5099615fa0dcd8bb44a3f72d72891097ea7c6552f507171cef55db667c37"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.960676 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqntd" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.966835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" event={"ID":"cb6140cb-0bb4-4fe5-bc14-85ac2e640334","Type":"ContainerDied","Data":"b7b941bbaeeeb56acb39911424cda5b78b09f6d20c31b0b61007bb3bbf7ad925"} Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.967075 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nx5ql" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.973840 4773 scope.go:117] "RemoveContainer" containerID="45ff9250c99b9ee894f62afcf9fd5877934581632cc4f6c89a5808227c4ff8c1" Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.994685 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:28:48 crc kubenswrapper[4773]: I1012 20:28:48.996392 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rx9db"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.013090 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.016689 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vgdhf"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.029870 4773 scope.go:117] "RemoveContainer" containerID="dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.033779 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.036021 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-x5t25"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.049586 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.050895 4773 scope.go:117] "RemoveContainer" containerID="ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.060863 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqntd"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.066049 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.067017 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nx5ql"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.067862 4773 scope.go:117] "RemoveContainer" containerID="48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.084899 4773 scope.go:117] "RemoveContainer" containerID="dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.085383 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9\": container with ID starting with dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9 not found: ID does not exist" containerID="dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.085414 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9"} err="failed to get container status \"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9\": rpc error: code = NotFound desc = could not find container \"dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9\": container with ID starting with dac7e9d1934ae59f84969e6c6c96917d76584f5a64361c9545ad1d5164f513b9 not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.085442 4773 scope.go:117] "RemoveContainer" containerID="ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.085834 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c\": container with ID starting with ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c not found: ID does not exist" containerID="ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.085860 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c"} err="failed to get container status \"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c\": rpc error: code = NotFound desc = could not find container \"ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c\": container with ID starting with ba164317bd7da55c1b4576df3ebf25c06e479b119b375193d645dfa4045c1d2c not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.085874 4773 scope.go:117] "RemoveContainer" containerID="48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.086842 4773 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c\": container with ID starting with 48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c not found: ID does not exist" containerID="48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.086866 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c"} err="failed to get container status \"48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c\": rpc error: code = NotFound desc = could not find container \"48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c\": container with ID starting with 48626fd1dfc3e0f0b47f2e739bc104de836d133507a0d9b5e115d190f9e62b2c not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.086881 4773 scope.go:117] "RemoveContainer" containerID="0c401204b88942c2bd5e4bf95cb130807d9b43a34745f4c4826b28cdb59d3a9d" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.104118 4773 scope.go:117] "RemoveContainer" containerID="e1e4f602c4e10b92a7928f99936673eb795dd71e4946217238eec9aacebc4b49" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.119156 4773 scope.go:117] "RemoveContainer" containerID="7e9d5cee17eb7b46cdbceceb01ea7083d95147acabbef24b7ba5231aa9f9e08d" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.130705 4773 scope.go:117] "RemoveContainer" containerID="77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.145868 4773 scope.go:117] "RemoveContainer" containerID="9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.159222 4773 scope.go:117] "RemoveContainer" 
containerID="665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.171314 4773 scope.go:117] "RemoveContainer" containerID="77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.171913 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146\": container with ID starting with 77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146 not found: ID does not exist" containerID="77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.171965 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146"} err="failed to get container status \"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146\": rpc error: code = NotFound desc = could not find container \"77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146\": container with ID starting with 77a266fe0e16da953f38988dd31e57057c359ea3ef4ac5d10ad23cb2a721b146 not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.172001 4773 scope.go:117] "RemoveContainer" containerID="9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.172465 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4\": container with ID starting with 9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4 not found: ID does not exist" containerID="9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4" Oct 12 20:28:49 crc 
kubenswrapper[4773]: I1012 20:28:49.172511 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4"} err="failed to get container status \"9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4\": rpc error: code = NotFound desc = could not find container \"9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4\": container with ID starting with 9bd6d0104b1054e6dadaae54dd199f61545dfca3dbe2a7a00132443437a067b4 not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.172549 4773 scope.go:117] "RemoveContainer" containerID="665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.173078 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b\": container with ID starting with 665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b not found: ID does not exist" containerID="665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.173112 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b"} err="failed to get container status \"665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b\": rpc error: code = NotFound desc = could not find container \"665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b\": container with ID starting with 665108d131c51a4041abea76df485ac34eba3ad70a74ec6017ff3694aaf6413b not found: ID does not exist" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.173129 4773 scope.go:117] "RemoveContainer" containerID="7375183bc84201c2724c425a6ab273ea033e7339c26d84be204445d263da6728" Oct 12 
20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916146 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mgmbc"] Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916360 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916374 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916383 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916390 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916400 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916407 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916427 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916451 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="registry-server" Oct 12 20:28:49 crc 
kubenswrapper[4773]: I1012 20:28:49.916457 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916471 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916477 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916485 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916490 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916498 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916505 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916512 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916518 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916527 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="extract-utilities" Oct 12 20:28:49 crc 
kubenswrapper[4773]: I1012 20:28:49.916532 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="extract-utilities" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916539 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916545 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916555 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916560 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: E1012 20:28:49.916567 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916573 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="extract-content" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916665 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916674 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916683 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" containerName="marketplace-operator" Oct 12 
20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.916699 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" containerName="registry-server" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.917368 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.920447 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.926561 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgmbc"] Oct 12 20:28:49 crc kubenswrapper[4773]: I1012 20:28:49.983054 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2v2pc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.091161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-catalog-content\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.091288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-utilities\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc 
kubenswrapper[4773]: I1012 20:28:50.091820 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvd9\" (UniqueName: \"kubernetes.io/projected/5bef1d5c-305a-457b-8d9d-d22b1d65d077-kube-api-access-rsvd9\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.112806 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bnf6k"] Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.113941 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.123625 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.134388 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnf6k"] Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.192986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-utilities\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.193051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvd9\" (UniqueName: \"kubernetes.io/projected/5bef1d5c-305a-457b-8d9d-d22b1d65d077-kube-api-access-rsvd9\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.193098 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-catalog-content\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.193598 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-catalog-content\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.193859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef1d5c-305a-457b-8d9d-d22b1d65d077-utilities\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.216225 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvd9\" (UniqueName: \"kubernetes.io/projected/5bef1d5c-305a-457b-8d9d-d22b1d65d077-kube-api-access-rsvd9\") pod \"redhat-marketplace-mgmbc\" (UID: \"5bef1d5c-305a-457b-8d9d-d22b1d65d077\") " pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.240390 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.294616 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-catalog-content\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.294679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-utilities\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.294705 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dscdh\" (UniqueName: \"kubernetes.io/projected/645a380b-fa47-45e2-a370-164d09e3a646-kube-api-access-dscdh\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.396815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-catalog-content\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.397286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-utilities\") pod 
\"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.397313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dscdh\" (UniqueName: \"kubernetes.io/projected/645a380b-fa47-45e2-a370-164d09e3a646-kube-api-access-dscdh\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.397798 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-catalog-content\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.398103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645a380b-fa47-45e2-a370-164d09e3a646-utilities\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.421868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscdh\" (UniqueName: \"kubernetes.io/projected/645a380b-fa47-45e2-a370-164d09e3a646-kube-api-access-dscdh\") pod \"certified-operators-bnf6k\" (UID: \"645a380b-fa47-45e2-a370-164d09e3a646\") " pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.432473 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.489240 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da0fc67-a78b-402d-ba7a-24209fea4258" path="/var/lib/kubelet/pods/0da0fc67-a78b-402d-ba7a-24209fea4258/volumes" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.489812 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8bb41c-c5fb-48de-b561-8b2473147603" path="/var/lib/kubelet/pods/4d8bb41c-c5fb-48de-b561-8b2473147603/volumes" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.490363 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6ebc1e-7018-4d16-a56b-a962f165de10" path="/var/lib/kubelet/pods/6b6ebc1e-7018-4d16-a56b-a962f165de10/volumes" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.491332 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b41529-16fa-43a7-a245-34ca5e013832" path="/var/lib/kubelet/pods/b5b41529-16fa-43a7-a245-34ca5e013832/volumes" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.491935 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6140cb-0bb4-4fe5-bc14-85ac2e640334" path="/var/lib/kubelet/pods/cb6140cb-0bb4-4fe5-bc14-85ac2e640334/volumes" Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.624660 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnf6k"] Oct 12 20:28:50 crc kubenswrapper[4773]: W1012 20:28:50.634387 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod645a380b_fa47_45e2_a370_164d09e3a646.slice/crio-c0d715d715cb126037dec3560f8d79fcbf6032d1d44c2e69d9344dba10015593 WatchSource:0}: Error finding container c0d715d715cb126037dec3560f8d79fcbf6032d1d44c2e69d9344dba10015593: Status 404 returned error can't find the container with id 
c0d715d715cb126037dec3560f8d79fcbf6032d1d44c2e69d9344dba10015593 Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.653272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgmbc"] Oct 12 20:28:50 crc kubenswrapper[4773]: W1012 20:28:50.662618 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bef1d5c_305a_457b_8d9d_d22b1d65d077.slice/crio-c0bb9195ec471edbcf624d5b1153733a28bae811bb3acd217002e9719a309bed WatchSource:0}: Error finding container c0bb9195ec471edbcf624d5b1153733a28bae811bb3acd217002e9719a309bed: Status 404 returned error can't find the container with id c0bb9195ec471edbcf624d5b1153733a28bae811bb3acd217002e9719a309bed Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.986159 4773 generic.go:334] "Generic (PLEG): container finished" podID="5bef1d5c-305a-457b-8d9d-d22b1d65d077" containerID="3b8f3b824fb69aa4c6ec4722f4da54ba6d5e4ae227eefd3dac022fb544105a4b" exitCode=0 Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.986343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgmbc" event={"ID":"5bef1d5c-305a-457b-8d9d-d22b1d65d077","Type":"ContainerDied","Data":"3b8f3b824fb69aa4c6ec4722f4da54ba6d5e4ae227eefd3dac022fb544105a4b"} Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.986469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgmbc" event={"ID":"5bef1d5c-305a-457b-8d9d-d22b1d65d077","Type":"ContainerStarted","Data":"c0bb9195ec471edbcf624d5b1153733a28bae811bb3acd217002e9719a309bed"} Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.990497 4773 generic.go:334] "Generic (PLEG): container finished" podID="645a380b-fa47-45e2-a370-164d09e3a646" containerID="09f8073fb8fabae1f7fe8c601d757d7d99e1caddd1a48c88fcb7b5a75716185f" exitCode=0 Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.991426 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnf6k" event={"ID":"645a380b-fa47-45e2-a370-164d09e3a646","Type":"ContainerDied","Data":"09f8073fb8fabae1f7fe8c601d757d7d99e1caddd1a48c88fcb7b5a75716185f"} Oct 12 20:28:50 crc kubenswrapper[4773]: I1012 20:28:50.991452 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnf6k" event={"ID":"645a380b-fa47-45e2-a370-164d09e3a646","Type":"ContainerStarted","Data":"c0d715d715cb126037dec3560f8d79fcbf6032d1d44c2e69d9344dba10015593"} Oct 12 20:28:51 crc kubenswrapper[4773]: I1012 20:28:51.997577 4773 generic.go:334] "Generic (PLEG): container finished" podID="645a380b-fa47-45e2-a370-164d09e3a646" containerID="0d7bc6f055b32f0232e2a900f11c44b248b1ba051bf9513d677af11612e1be11" exitCode=0 Oct 12 20:28:51 crc kubenswrapper[4773]: I1012 20:28:51.998430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnf6k" event={"ID":"645a380b-fa47-45e2-a370-164d09e3a646","Type":"ContainerDied","Data":"0d7bc6f055b32f0232e2a900f11c44b248b1ba051bf9513d677af11612e1be11"} Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.309125 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7pqm"] Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.310485 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.315535 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.319838 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7pqm"] Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.419248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-utilities\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.419451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2m9p\" (UniqueName: \"kubernetes.io/projected/88fd9e23-9895-4ff5-a626-c695ec043315-kube-api-access-q2m9p\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.419587 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-catalog-content\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.520590 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-catalog-content\") pod \"redhat-operators-j7pqm\" (UID: 
\"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.520651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-utilities\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.520683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2m9p\" (UniqueName: \"kubernetes.io/projected/88fd9e23-9895-4ff5-a626-c695ec043315-kube-api-access-q2m9p\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.521472 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-catalog-content\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.521696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fd9e23-9895-4ff5-a626-c695ec043315-utilities\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.527041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wr5bp"] Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.528298 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.531954 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.539411 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr5bp"] Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.559750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2m9p\" (UniqueName: \"kubernetes.io/projected/88fd9e23-9895-4ff5-a626-c695ec043315-kube-api-access-q2m9p\") pod \"redhat-operators-j7pqm\" (UID: \"88fd9e23-9895-4ff5-a626-c695ec043315\") " pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.622170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-catalog-content\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.622221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-utilities\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.622248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg2vn\" (UniqueName: \"kubernetes.io/projected/b7565fd8-a54f-4a89-8162-633feec6e76f-kube-api-access-pg2vn\") pod \"community-operators-wr5bp\" (UID: 
\"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.629128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.725044 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-catalog-content\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.725391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-utilities\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.725438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg2vn\" (UniqueName: \"kubernetes.io/projected/b7565fd8-a54f-4a89-8162-633feec6e76f-kube-api-access-pg2vn\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.725536 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-catalog-content\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.726040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7565fd8-a54f-4a89-8162-633feec6e76f-utilities\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.750052 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg2vn\" (UniqueName: \"kubernetes.io/projected/b7565fd8-a54f-4a89-8162-633feec6e76f-kube-api-access-pg2vn\") pod \"community-operators-wr5bp\" (UID: \"b7565fd8-a54f-4a89-8162-633feec6e76f\") " pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.810906 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7pqm"] Oct 12 20:28:52 crc kubenswrapper[4773]: W1012 20:28:52.826382 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88fd9e23_9895_4ff5_a626_c695ec043315.slice/crio-a408e67bb910deb4816f9ca4874096a9391b4860b2304b50b395e08e9a94690a WatchSource:0}: Error finding container a408e67bb910deb4816f9ca4874096a9391b4860b2304b50b395e08e9a94690a: Status 404 returned error can't find the container with id a408e67bb910deb4816f9ca4874096a9391b4860b2304b50b395e08e9a94690a Oct 12 20:28:52 crc kubenswrapper[4773]: I1012 20:28:52.878042 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.015555 4773 generic.go:334] "Generic (PLEG): container finished" podID="5bef1d5c-305a-457b-8d9d-d22b1d65d077" containerID="7756edf449f17197638d10fe6b06d3f7bd8adf3b9ce7916f52887d6a01835422" exitCode=0 Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.015634 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgmbc" event={"ID":"5bef1d5c-305a-457b-8d9d-d22b1d65d077","Type":"ContainerDied","Data":"7756edf449f17197638d10fe6b06d3f7bd8adf3b9ce7916f52887d6a01835422"} Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.021878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnf6k" event={"ID":"645a380b-fa47-45e2-a370-164d09e3a646","Type":"ContainerStarted","Data":"a197b93645ea8fea90277468e2265f531b0a21b60b4fa0055c546f841701ac6b"} Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.028880 4773 generic.go:334] "Generic (PLEG): container finished" podID="88fd9e23-9895-4ff5-a626-c695ec043315" containerID="d6d61fe49c2557643cffc3dc7ed423c194885c0a18310b043c5433bbcad9885b" exitCode=0 Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.028924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7pqm" event={"ID":"88fd9e23-9895-4ff5-a626-c695ec043315","Type":"ContainerDied","Data":"d6d61fe49c2557643cffc3dc7ed423c194885c0a18310b043c5433bbcad9885b"} Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.028949 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7pqm" event={"ID":"88fd9e23-9895-4ff5-a626-c695ec043315","Type":"ContainerStarted","Data":"a408e67bb910deb4816f9ca4874096a9391b4860b2304b50b395e08e9a94690a"} Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.053652 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-bnf6k" podStartSLOduration=1.558289586 podStartE2EDuration="3.053631678s" podCreationTimestamp="2025-10-12 20:28:50 +0000 UTC" firstStartedPulling="2025-10-12 20:28:50.99192207 +0000 UTC m=+279.228220630" lastFinishedPulling="2025-10-12 20:28:52.487264162 +0000 UTC m=+280.723562722" observedRunningTime="2025-10-12 20:28:53.052326921 +0000 UTC m=+281.288625481" watchObservedRunningTime="2025-10-12 20:28:53.053631678 +0000 UTC m=+281.289930238" Oct 12 20:28:53 crc kubenswrapper[4773]: I1012 20:28:53.109048 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr5bp"] Oct 12 20:28:53 crc kubenswrapper[4773]: W1012 20:28:53.122891 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7565fd8_a54f_4a89_8162_633feec6e76f.slice/crio-710c2a1d7e303c56b40990e774877fcc700b5e6296ea87b0ebe0ec80fc9f198f WatchSource:0}: Error finding container 710c2a1d7e303c56b40990e774877fcc700b5e6296ea87b0ebe0ec80fc9f198f: Status 404 returned error can't find the container with id 710c2a1d7e303c56b40990e774877fcc700b5e6296ea87b0ebe0ec80fc9f198f Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.035288 4773 generic.go:334] "Generic (PLEG): container finished" podID="b7565fd8-a54f-4a89-8162-633feec6e76f" containerID="02de814e5399ea8fe3bcf32af02d130b3fdf578582c71512ff9fce6cf06f1e84" exitCode=0 Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.035353 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr5bp" event={"ID":"b7565fd8-a54f-4a89-8162-633feec6e76f","Type":"ContainerDied","Data":"02de814e5399ea8fe3bcf32af02d130b3fdf578582c71512ff9fce6cf06f1e84"} Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.035987 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr5bp" 
event={"ID":"b7565fd8-a54f-4a89-8162-633feec6e76f","Type":"ContainerStarted","Data":"710c2a1d7e303c56b40990e774877fcc700b5e6296ea87b0ebe0ec80fc9f198f"} Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.038589 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgmbc" event={"ID":"5bef1d5c-305a-457b-8d9d-d22b1d65d077","Type":"ContainerStarted","Data":"d1961c7d2722b0d1329710af0a10775bcfa15de24d6e393bcbf7c2cc4a0a421e"} Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.041821 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7pqm" event={"ID":"88fd9e23-9895-4ff5-a626-c695ec043315","Type":"ContainerStarted","Data":"bd208b484e831408c4855f897c7d3e5b194df394d128d1cd1228175b2a5396d8"} Oct 12 20:28:54 crc kubenswrapper[4773]: I1012 20:28:54.081344 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mgmbc" podStartSLOduration=2.669602931 podStartE2EDuration="5.081323112s" podCreationTimestamp="2025-10-12 20:28:49 +0000 UTC" firstStartedPulling="2025-10-12 20:28:50.993651159 +0000 UTC m=+279.229949719" lastFinishedPulling="2025-10-12 20:28:53.40537134 +0000 UTC m=+281.641669900" observedRunningTime="2025-10-12 20:28:54.079273704 +0000 UTC m=+282.315572264" watchObservedRunningTime="2025-10-12 20:28:54.081323112 +0000 UTC m=+282.317621672" Oct 12 20:28:55 crc kubenswrapper[4773]: I1012 20:28:55.050451 4773 generic.go:334] "Generic (PLEG): container finished" podID="88fd9e23-9895-4ff5-a626-c695ec043315" containerID="bd208b484e831408c4855f897c7d3e5b194df394d128d1cd1228175b2a5396d8" exitCode=0 Oct 12 20:28:55 crc kubenswrapper[4773]: I1012 20:28:55.050672 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7pqm" event={"ID":"88fd9e23-9895-4ff5-a626-c695ec043315","Type":"ContainerDied","Data":"bd208b484e831408c4855f897c7d3e5b194df394d128d1cd1228175b2a5396d8"} 
Oct 12 20:28:55 crc kubenswrapper[4773]: I1012 20:28:55.055122 4773 generic.go:334] "Generic (PLEG): container finished" podID="b7565fd8-a54f-4a89-8162-633feec6e76f" containerID="4afc5b3046b245c6e69ea3d336e4950dde3ca0926db38886600323802c245e38" exitCode=0 Oct 12 20:28:55 crc kubenswrapper[4773]: I1012 20:28:55.055244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr5bp" event={"ID":"b7565fd8-a54f-4a89-8162-633feec6e76f","Type":"ContainerDied","Data":"4afc5b3046b245c6e69ea3d336e4950dde3ca0926db38886600323802c245e38"} Oct 12 20:28:57 crc kubenswrapper[4773]: I1012 20:28:57.072833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7pqm" event={"ID":"88fd9e23-9895-4ff5-a626-c695ec043315","Type":"ContainerStarted","Data":"8e46c80e2c0b8783383afea63a44377167194c98e349e26040c08a7380c82ef7"} Oct 12 20:28:57 crc kubenswrapper[4773]: I1012 20:28:57.075850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr5bp" event={"ID":"b7565fd8-a54f-4a89-8162-633feec6e76f","Type":"ContainerStarted","Data":"15b6b0705c7e90f471c6e00047f991c7b5479b653201b6dbb4f43abefcbb9cbc"} Oct 12 20:28:57 crc kubenswrapper[4773]: I1012 20:28:57.102071 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7pqm" podStartSLOduration=2.699544399 podStartE2EDuration="5.102056038s" podCreationTimestamp="2025-10-12 20:28:52 +0000 UTC" firstStartedPulling="2025-10-12 20:28:53.030705328 +0000 UTC m=+281.267003888" lastFinishedPulling="2025-10-12 20:28:55.433216967 +0000 UTC m=+283.669515527" observedRunningTime="2025-10-12 20:28:57.099906327 +0000 UTC m=+285.336204897" watchObservedRunningTime="2025-10-12 20:28:57.102056038 +0000 UTC m=+285.338354598" Oct 12 20:28:57 crc kubenswrapper[4773]: I1012 20:28:57.121512 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-wr5bp" podStartSLOduration=3.618322795 podStartE2EDuration="5.12149937s" podCreationTimestamp="2025-10-12 20:28:52 +0000 UTC" firstStartedPulling="2025-10-12 20:28:54.036973494 +0000 UTC m=+282.273272054" lastFinishedPulling="2025-10-12 20:28:55.540150069 +0000 UTC m=+283.776448629" observedRunningTime="2025-10-12 20:28:57.121014056 +0000 UTC m=+285.357312636" watchObservedRunningTime="2025-10-12 20:28:57.12149937 +0000 UTC m=+285.357797930" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.240763 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.241697 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.310256 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.433534 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.433624 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:29:00 crc kubenswrapper[4773]: I1012 20:29:00.476696 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:29:01 crc kubenswrapper[4773]: I1012 20:29:01.135381 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bnf6k" Oct 12 20:29:01 crc kubenswrapper[4773]: I1012 20:29:01.151513 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-mgmbc" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.629286 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.630105 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.678286 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.878580 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.878648 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:29:02 crc kubenswrapper[4773]: I1012 20:29:02.926275 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:29:03 crc kubenswrapper[4773]: I1012 20:29:03.170819 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7pqm" Oct 12 20:29:03 crc kubenswrapper[4773]: I1012 20:29:03.171852 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wr5bp" Oct 12 20:29:58 crc kubenswrapper[4773]: I1012 20:29:58.670572 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:29:58 crc kubenswrapper[4773]: I1012 20:29:58.671420 4773 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.134882 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng"] Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.136579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.140052 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.140598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.146469 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng"] Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.184960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.185024 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9lvs\" (UniqueName: 
\"kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.185045 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.286659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.287154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.287220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9lvs\" (UniqueName: \"kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc 
kubenswrapper[4773]: I1012 20:30:00.290532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.303385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.303435 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9lvs\" (UniqueName: \"kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs\") pod \"collect-profiles-29338350-f46ng\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.455526 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:00 crc kubenswrapper[4773]: I1012 20:30:00.660673 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng"] Oct 12 20:30:01 crc kubenswrapper[4773]: I1012 20:30:01.459285 4773 generic.go:334] "Generic (PLEG): container finished" podID="da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" containerID="ddccae5297268965e325d6e009816496a5ff282ac904811dfdbacdd2456d845f" exitCode=0 Oct 12 20:30:01 crc kubenswrapper[4773]: I1012 20:30:01.459361 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" event={"ID":"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0","Type":"ContainerDied","Data":"ddccae5297268965e325d6e009816496a5ff282ac904811dfdbacdd2456d845f"} Oct 12 20:30:01 crc kubenswrapper[4773]: I1012 20:30:01.459403 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" event={"ID":"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0","Type":"ContainerStarted","Data":"c0bbae5ba34a8ce66659f22666563459e61df97d24f643d53b8b180d90784e32"} Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.671771 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.724615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume\") pod \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.724727 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9lvs\" (UniqueName: \"kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs\") pod \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.724795 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume\") pod \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\" (UID: \"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0\") " Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.726239 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" (UID: "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.738241 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs" (OuterVolumeSpecName: "kube-api-access-g9lvs") pod "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" (UID: "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0"). 
InnerVolumeSpecName "kube-api-access-g9lvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.746320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" (UID: "da223ee3-ddbf-415a-8ad8-fb78a81fe5a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.825735 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.826007 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9lvs\" (UniqueName: \"kubernetes.io/projected/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-kube-api-access-g9lvs\") on node \"crc\" DevicePath \"\"" Oct 12 20:30:02 crc kubenswrapper[4773]: I1012 20:30:02.826103 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:30:03 crc kubenswrapper[4773]: I1012 20:30:03.472081 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" event={"ID":"da223ee3-ddbf-415a-8ad8-fb78a81fe5a0","Type":"ContainerDied","Data":"c0bbae5ba34a8ce66659f22666563459e61df97d24f643d53b8b180d90784e32"} Oct 12 20:30:03 crc kubenswrapper[4773]: I1012 20:30:03.472167 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng" Oct 12 20:30:03 crc kubenswrapper[4773]: I1012 20:30:03.472185 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0bbae5ba34a8ce66659f22666563459e61df97d24f643d53b8b180d90784e32" Oct 12 20:30:28 crc kubenswrapper[4773]: I1012 20:30:28.669376 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:30:28 crc kubenswrapper[4773]: I1012 20:30:28.670099 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.670075 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.671960 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.672196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.673175 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.673450 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66" gracePeriod=600 Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.824829 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66" exitCode=0 Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.824885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66"} Oct 12 20:30:58 crc kubenswrapper[4773]: I1012 20:30:58.824992 4773 scope.go:117] "RemoveContainer" containerID="d3bb3b1831f0babee86e34e91f8a368cdb311e4aeef4d555ebf3b4f682e3932c" Oct 12 20:30:59 crc kubenswrapper[4773]: I1012 20:30:59.842114 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b"} Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.410542 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7h2vk"] Oct 12 20:32:06 crc kubenswrapper[4773]: E1012 20:32:06.411298 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" containerName="collect-profiles" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.411313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" containerName="collect-profiles" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.411444 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" containerName="collect-profiles" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.411924 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.439148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7h2vk"] Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-trusted-ca\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612667 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-bound-sa-token\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn5l9\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-kube-api-access-kn5l9\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbc2bee-4525-4879-b076-ade02bb9a4e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612774 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-certificates\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.612875 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.613042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbc2bee-4525-4879-b076-ade02bb9a4e9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.613134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-tls\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.638704 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.714271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-trusted-ca\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.714609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-bound-sa-token\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.714687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn5l9\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-kube-api-access-kn5l9\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.714806 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbc2bee-4525-4879-b076-ade02bb9a4e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc 
kubenswrapper[4773]: I1012 20:32:06.714931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-certificates\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.715069 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbc2bee-4525-4879-b076-ade02bb9a4e9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.715164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-tls\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.715407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbc2bee-4525-4879-b076-ade02bb9a4e9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.715650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-trusted-ca\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.715920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-certificates\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.723000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbc2bee-4525-4879-b076-ade02bb9a4e9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.725797 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-registry-tls\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.730861 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-bound-sa-token\") pod \"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:06 crc kubenswrapper[4773]: I1012 20:32:06.732308 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn5l9\" (UniqueName: \"kubernetes.io/projected/ebbc2bee-4525-4879-b076-ade02bb9a4e9-kube-api-access-kn5l9\") pod 
\"image-registry-66df7c8f76-7h2vk\" (UID: \"ebbc2bee-4525-4879-b076-ade02bb9a4e9\") " pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:07 crc kubenswrapper[4773]: I1012 20:32:07.028435 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:07 crc kubenswrapper[4773]: I1012 20:32:07.233643 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7h2vk"] Oct 12 20:32:07 crc kubenswrapper[4773]: I1012 20:32:07.255564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" event={"ID":"ebbc2bee-4525-4879-b076-ade02bb9a4e9","Type":"ContainerStarted","Data":"524670948f512df5da1fa8a83a01e90d6d236f39507e2a6e41e938abe5bf52b4"} Oct 12 20:32:08 crc kubenswrapper[4773]: I1012 20:32:08.271018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" event={"ID":"ebbc2bee-4525-4879-b076-ade02bb9a4e9","Type":"ContainerStarted","Data":"3b774549b699aefc2813d75c1770fbce21d2450f766595aac74568981343f164"} Oct 12 20:32:08 crc kubenswrapper[4773]: I1012 20:32:08.271396 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:08 crc kubenswrapper[4773]: I1012 20:32:08.301253 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" podStartSLOduration=2.301225468 podStartE2EDuration="2.301225468s" podCreationTimestamp="2025-10-12 20:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:32:08.299129519 +0000 UTC m=+476.535428089" watchObservedRunningTime="2025-10-12 20:32:08.301225468 +0000 UTC m=+476.537524068" Oct 12 20:32:27 crc 
kubenswrapper[4773]: I1012 20:32:27.035399 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7h2vk" Oct 12 20:32:27 crc kubenswrapper[4773]: I1012 20:32:27.101339 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.151584 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" podUID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" containerName="registry" containerID="cri-o://42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b" gracePeriod=30 Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.544487 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.544622 4773 generic.go:334] "Generic (PLEG): container finished" podID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" containerID="42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b" exitCode=0 Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.544642 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" event={"ID":"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c","Type":"ContainerDied","Data":"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b"} Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.545185 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" event={"ID":"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c","Type":"ContainerDied","Data":"b3e8204b13495af2d35a116d4f40396baca37b3af6e0f8609e4745f480f8e88b"} Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.545240 4773 scope.go:117] "RemoveContainer" 
containerID="42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.566620 4773 scope.go:117] "RemoveContainer" containerID="42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b" Oct 12 20:32:52 crc kubenswrapper[4773]: E1012 20:32:52.567087 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b\": container with ID starting with 42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b not found: ID does not exist" containerID="42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.567113 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b"} err="failed to get container status \"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b\": rpc error: code = NotFound desc = could not find container \"42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b\": container with ID starting with 42ab6d6d3555ec8c4fedb029b1461f96a3f5e91f582afd51e0e406d7f61cd20b not found: ID does not exist" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.602883 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" 
(UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603417 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.603953 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.604440 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.604824 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8spq\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq\") pod \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\" (UID: \"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c\") " Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.606116 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.606142 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.609888 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq" (OuterVolumeSpecName: "kube-api-access-h8spq") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: 
"d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "kube-api-access-h8spq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.611139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.611187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.613451 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.614147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.618145 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" (UID: "d504f7fe-2dd0-4906-b3e8-dee9ebb9812c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.707509 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.707560 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.707572 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.707587 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:52 crc kubenswrapper[4773]: I1012 20:32:52.707598 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8spq\" (UniqueName: \"kubernetes.io/projected/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c-kube-api-access-h8spq\") on node \"crc\" DevicePath \"\"" Oct 12 20:32:53 crc kubenswrapper[4773]: I1012 20:32:53.553852 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4j5xq" Oct 12 20:32:53 crc kubenswrapper[4773]: I1012 20:32:53.599255 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:32:53 crc kubenswrapper[4773]: I1012 20:32:53.607887 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4j5xq"] Oct 12 20:32:54 crc kubenswrapper[4773]: I1012 20:32:54.498551 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" path="/var/lib/kubelet/pods/d504f7fe-2dd0-4906-b3e8-dee9ebb9812c/volumes" Oct 12 20:32:58 crc kubenswrapper[4773]: I1012 20:32:58.669991 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:32:58 crc kubenswrapper[4773]: I1012 20:32:58.670368 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:33:28 crc kubenswrapper[4773]: I1012 20:33:28.669294 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:33:28 crc kubenswrapper[4773]: I1012 20:33:28.669874 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.657674 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lshqf"] Oct 12 20:33:54 crc kubenswrapper[4773]: E1012 20:33:54.658377 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" containerName="registry" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.658394 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" containerName="registry" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.658520 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d504f7fe-2dd0-4906-b3e8-dee9ebb9812c" containerName="registry" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.658904 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.665096 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.665205 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.665347 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7gvfc" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.670358 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qbzg6"] Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.670957 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qbzg6" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.679447 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4z9dg" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.680756 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lshqf"] Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.702769 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qbzg6"] Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.723459 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gh5rv"] Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.724382 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.727179 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gh5rv"] Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.729048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6wvgj" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.751835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79859\" (UniqueName: \"kubernetes.io/projected/6003117d-518b-4b81-98ba-01ffbdea09c7-kube-api-access-79859\") pod \"cert-manager-5b446d88c5-qbzg6\" (UID: \"6003117d-518b-4b81-98ba-01ffbdea09c7\") " pod="cert-manager/cert-manager-5b446d88c5-qbzg6" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.751873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc4kd\" (UniqueName: 
\"kubernetes.io/projected/5c1610be-cf14-4659-8bf8-46cbcb55aa47-kube-api-access-lc4kd\") pod \"cert-manager-cainjector-7f985d654d-lshqf\" (UID: \"5c1610be-cf14-4659-8bf8-46cbcb55aa47\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.751932 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv87s\" (UniqueName: \"kubernetes.io/projected/882eaacb-03d9-4250-ab13-b702c4f4b91c-kube-api-access-cv87s\") pod \"cert-manager-webhook-5655c58dd6-gh5rv\" (UID: \"882eaacb-03d9-4250-ab13-b702c4f4b91c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.852642 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79859\" (UniqueName: \"kubernetes.io/projected/6003117d-518b-4b81-98ba-01ffbdea09c7-kube-api-access-79859\") pod \"cert-manager-5b446d88c5-qbzg6\" (UID: \"6003117d-518b-4b81-98ba-01ffbdea09c7\") " pod="cert-manager/cert-manager-5b446d88c5-qbzg6" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.852691 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc4kd\" (UniqueName: \"kubernetes.io/projected/5c1610be-cf14-4659-8bf8-46cbcb55aa47-kube-api-access-lc4kd\") pod \"cert-manager-cainjector-7f985d654d-lshqf\" (UID: \"5c1610be-cf14-4659-8bf8-46cbcb55aa47\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.852790 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv87s\" (UniqueName: \"kubernetes.io/projected/882eaacb-03d9-4250-ab13-b702c4f4b91c-kube-api-access-cv87s\") pod \"cert-manager-webhook-5655c58dd6-gh5rv\" (UID: \"882eaacb-03d9-4250-ab13-b702c4f4b91c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 
20:33:54.870862 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79859\" (UniqueName: \"kubernetes.io/projected/6003117d-518b-4b81-98ba-01ffbdea09c7-kube-api-access-79859\") pod \"cert-manager-5b446d88c5-qbzg6\" (UID: \"6003117d-518b-4b81-98ba-01ffbdea09c7\") " pod="cert-manager/cert-manager-5b446d88c5-qbzg6" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.872478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc4kd\" (UniqueName: \"kubernetes.io/projected/5c1610be-cf14-4659-8bf8-46cbcb55aa47-kube-api-access-lc4kd\") pod \"cert-manager-cainjector-7f985d654d-lshqf\" (UID: \"5c1610be-cf14-4659-8bf8-46cbcb55aa47\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.873482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv87s\" (UniqueName: \"kubernetes.io/projected/882eaacb-03d9-4250-ab13-b702c4f4b91c-kube-api-access-cv87s\") pod \"cert-manager-webhook-5655c58dd6-gh5rv\" (UID: \"882eaacb-03d9-4250-ab13-b702c4f4b91c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.978657 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" Oct 12 20:33:54 crc kubenswrapper[4773]: I1012 20:33:54.987658 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qbzg6" Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.044878 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.218213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qbzg6"] Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.240324 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.245294 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-lshqf"] Oct 12 20:33:55 crc kubenswrapper[4773]: W1012 20:33:55.251943 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c1610be_cf14_4659_8bf8_46cbcb55aa47.slice/crio-bf1cc1681082fd8f3222e86f449130cf56d15ded8dfcc12c060ef1d74f412376 WatchSource:0}: Error finding container bf1cc1681082fd8f3222e86f449130cf56d15ded8dfcc12c060ef1d74f412376: Status 404 returned error can't find the container with id bf1cc1681082fd8f3222e86f449130cf56d15ded8dfcc12c060ef1d74f412376 Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.312256 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gh5rv"] Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.953449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qbzg6" event={"ID":"6003117d-518b-4b81-98ba-01ffbdea09c7","Type":"ContainerStarted","Data":"5cb347fe279594e729320477538e3041deb37c29cace7425c6c3879181e9e113"} Oct 12 20:33:55 crc kubenswrapper[4773]: I1012 20:33:55.955823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" event={"ID":"882eaacb-03d9-4250-ab13-b702c4f4b91c","Type":"ContainerStarted","Data":"c11ab08782ee74a7678c8b1956c6cdd1f4fcbf503199ca681ac73faf9d2b6cf9"} Oct 12 20:33:55 crc kubenswrapper[4773]: 
I1012 20:33:55.956582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" event={"ID":"5c1610be-cf14-4659-8bf8-46cbcb55aa47","Type":"ContainerStarted","Data":"bf1cc1681082fd8f3222e86f449130cf56d15ded8dfcc12c060ef1d74f412376"} Oct 12 20:33:57 crc kubenswrapper[4773]: I1012 20:33:57.970636 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" event={"ID":"5c1610be-cf14-4659-8bf8-46cbcb55aa47","Type":"ContainerStarted","Data":"3acd2f5dab397a9e7e6a52bbf4dd7f4748288fc1a01150bc488ac9e799cfd9cf"} Oct 12 20:33:57 crc kubenswrapper[4773]: I1012 20:33:57.973665 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qbzg6" event={"ID":"6003117d-518b-4b81-98ba-01ffbdea09c7","Type":"ContainerStarted","Data":"8b906970f7cdbfe1396a64a8229354d1cc5696ed30a2b8b3b0a9d09d38cd45b9"} Oct 12 20:33:57 crc kubenswrapper[4773]: I1012 20:33:57.986508 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-lshqf" podStartSLOduration=1.685815967 podStartE2EDuration="3.986492716s" podCreationTimestamp="2025-10-12 20:33:54 +0000 UTC" firstStartedPulling="2025-10-12 20:33:55.254221933 +0000 UTC m=+583.490520493" lastFinishedPulling="2025-10-12 20:33:57.554898682 +0000 UTC m=+585.791197242" observedRunningTime="2025-10-12 20:33:57.983908624 +0000 UTC m=+586.220207214" watchObservedRunningTime="2025-10-12 20:33:57.986492716 +0000 UTC m=+586.222791276" Oct 12 20:33:57 crc kubenswrapper[4773]: I1012 20:33:57.998537 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qbzg6" podStartSLOduration=1.6347249160000001 podStartE2EDuration="3.99851799s" podCreationTimestamp="2025-10-12 20:33:54 +0000 UTC" firstStartedPulling="2025-10-12 20:33:55.24006783 +0000 UTC m=+583.476366390" lastFinishedPulling="2025-10-12 
20:33:57.603860904 +0000 UTC m=+585.840159464" observedRunningTime="2025-10-12 20:33:57.998425908 +0000 UTC m=+586.234724468" watchObservedRunningTime="2025-10-12 20:33:57.99851799 +0000 UTC m=+586.234816550" Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.669637 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.669694 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.669751 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.670297 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.670357 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b" gracePeriod=600 Oct 12 
20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.981273 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" event={"ID":"882eaacb-03d9-4250-ab13-b702c4f4b91c","Type":"ContainerStarted","Data":"888a1eddf0ce6c21dd82210919566f14db1fcc7b0d3219a97cc16ec64689aed8"} Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.981624 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.983901 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b" exitCode=0 Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.983953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b"} Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.983991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa"} Oct 12 20:33:58 crc kubenswrapper[4773]: I1012 20:33:58.984008 4773 scope.go:117] "RemoveContainer" containerID="971ea4f63e4f5f53d552a948f94e7131fae037c3dab2edf2ed90e1f0e80cdf66" Oct 12 20:33:59 crc kubenswrapper[4773]: I1012 20:33:59.023287 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" podStartSLOduration=1.787762593 podStartE2EDuration="5.02326697s" podCreationTimestamp="2025-10-12 20:33:54 +0000 UTC" firstStartedPulling="2025-10-12 20:33:55.327333507 +0000 UTC 
m=+583.563632067" lastFinishedPulling="2025-10-12 20:33:58.562837854 +0000 UTC m=+586.799136444" observedRunningTime="2025-10-12 20:33:59.003037238 +0000 UTC m=+587.239335798" watchObservedRunningTime="2025-10-12 20:33:59.02326697 +0000 UTC m=+587.259565530" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.048410 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-gh5rv" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.382250 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tzm6q"] Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.382875 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-controller" containerID="cri-o://65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.382984 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.383035 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-node" containerID="cri-o://323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.383069 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" 
containerName="ovn-acl-logging" containerID="cri-o://ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.383153 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="sbdb" containerID="cri-o://471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.383156 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="northd" containerID="cri-o://bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.383276 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="nbdb" containerID="cri-o://131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.426210 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" containerID="cri-o://5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" gracePeriod=30 Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.722568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/3.log" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.724687 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovn-acl-logging/0.log" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.725300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovn-controller/0.log" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.725832 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770154 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-psd7k"] Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770335 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770350 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770365 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770372 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770379 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="nbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770386 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="nbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770396 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-acl-logging" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770402 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-acl-logging" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770411 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-node" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770417 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-node" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770424 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="northd" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770429 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="northd" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770437 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="sbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770442 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="sbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770450 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770456 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770464 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770469 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770478 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770484 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770490 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770496 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770504 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770509 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: E1012 20:34:05.770515 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kubecfg-setup" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770522 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kubecfg-setup" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770608 4773 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770616 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-ovn-metrics" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770624 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770634 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770640 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="kube-rbac-proxy-node" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770647 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="sbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770656 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770663 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovn-acl-logging" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770671 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770678 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="nbdb" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770686 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="northd" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.770852 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerName="ovnkube-controller" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.772292 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.881565 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.881846 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882378 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882537 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 
20:34:05.882673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnldc\" (UniqueName: \"kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883385 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882321 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.882622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: 
\"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883782 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883972 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884042 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884090 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884181 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash\") pod \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\" (UID: \"9bd89b89-9347-4b0d-8861-4ff26c9640b5\") " Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884383 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883527 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket" (OuterVolumeSpecName: "log-socket") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.883871 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884450 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884472 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log" (OuterVolumeSpecName: "node-log") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884476 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884522 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884760 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash" (OuterVolumeSpecName: "host-slash") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-kubelet\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.884918 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885053 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-slash\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-config\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45dd6108-51bb-40c8-a587-723667eb0383-ovn-node-metrics-cert\") pod 
\"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885413 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-log-socket\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-ovn\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885563 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-netd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885596 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-bin\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-systemd-units\") pod 
\"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885776 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885812 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-script-lib\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-node-log\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.885948 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886001 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-var-lib-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886051 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-netns\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886130 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886195 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-etc-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886231 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-env-overrides\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886328 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-systemd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886490 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxzr\" (UniqueName: \"kubernetes.io/projected/45dd6108-51bb-40c8-a587-723667eb0383-kube-api-access-fkxzr\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886598 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886653 4773 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-log-socket\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886703 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886779 4773 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.886833 4773 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-node-log\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887019 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887078 4773 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887141 4773 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887200 4773 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887257 4773 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887355 4773 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-slash\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887417 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-config\") on node 
\"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887474 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887535 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887590 4773 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887649 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.887708 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.888333 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.889155 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc" (OuterVolumeSpecName: "kube-api-access-wnldc") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "kube-api-access-wnldc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.895523 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9bd89b89-9347-4b0d-8861-4ff26c9640b5" (UID: "9bd89b89-9347-4b0d-8861-4ff26c9640b5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-systemd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989306 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxzr\" (UniqueName: \"kubernetes.io/projected/45dd6108-51bb-40c8-a587-723667eb0383-kube-api-access-fkxzr\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989333 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-kubelet\") pod 
\"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-slash\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-config\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45dd6108-51bb-40c8-a587-723667eb0383-ovn-node-metrics-cert\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-log-socket\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-systemd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-ovn\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-netd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989548 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-bin\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-systemd-units\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 
20:34:05.989630 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-script-lib\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989707 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-node-log\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989763 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989779 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-kubelet\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989799 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-var-lib-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989824 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-slash\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989840 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-netns\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-systemd-units\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-ovn\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989891 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-netd\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989935 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-cni-bin\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-etc-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.989971 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-env-overrides\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990042 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnldc\" (UniqueName: \"kubernetes.io/projected/9bd89b89-9347-4b0d-8861-4ff26c9640b5-kube-api-access-wnldc\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990056 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bd89b89-9347-4b0d-8861-4ff26c9640b5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990069 4773 
reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bd89b89-9347-4b0d-8861-4ff26c9640b5-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990546 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-config\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-log-socket\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990627 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-env-overrides\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-run-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: 
\"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990700 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-var-lib-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-run-netns\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-etc-openvswitch\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.990850 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/45dd6108-51bb-40c8-a587-723667eb0383-node-log\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.991245 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/45dd6108-51bb-40c8-a587-723667eb0383-ovnkube-script-lib\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:05 crc kubenswrapper[4773]: I1012 20:34:05.994999 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/45dd6108-51bb-40c8-a587-723667eb0383-ovn-node-metrics-cert\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.005364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxzr\" (UniqueName: \"kubernetes.io/projected/45dd6108-51bb-40c8-a587-723667eb0383-kube-api-access-fkxzr\") pod \"ovnkube-node-psd7k\" (UID: \"45dd6108-51bb-40c8-a587-723667eb0383\") " pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.030797 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovnkube-controller/3.log" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.033381 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovn-acl-logging/0.log" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.033862 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzm6q_9bd89b89-9347-4b0d-8861-4ff26c9640b5/ovn-controller/0.log" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034201 4773 generic.go:334] "Generic 
(PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034223 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034230 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034239 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034245 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034252 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" exitCode=0 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034258 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" exitCode=143 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034267 4773 generic.go:334] "Generic (PLEG): container finished" podID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" exitCode=143 Oct 12 20:34:06 crc kubenswrapper[4773]: 
I1012 20:34:06.034315 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034351 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034361 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034389 4773 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034398 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034403 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034409 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034414 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034419 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034424 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034429 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034434 4773 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034440 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034447 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034453 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034459 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034464 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034469 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034474 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034479 4773 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034484 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034491 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034496 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034503 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034510 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034517 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034522 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 
20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034528 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034533 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034539 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034544 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034551 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034556 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034561 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" 
event={"ID":"9bd89b89-9347-4b0d-8861-4ff26c9640b5","Type":"ContainerDied","Data":"7c19adea7522cbf456c1874092f7c4c3a34f13a4c3c3fba58192f3d62ce50ab6"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034575 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034580 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034586 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034592 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034598 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034604 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034609 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034615 4773 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034620 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034625 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034638 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.034781 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzm6q" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.042323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/2.log" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.042820 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/1.log" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.042859 4773 generic.go:334] "Generic (PLEG): container finished" podID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" containerID="47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126" exitCode=2 Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.042887 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerDied","Data":"47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126"} 
Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.042912 4773 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec"} Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.043327 4773 scope.go:117] "RemoveContainer" containerID="47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.043617 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-67c6h_openshift-multus(69ad9308-d890-40f4-9b73-fb4aad78ccd1)\"" pod="openshift-multus/multus-67c6h" podUID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.058758 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.083446 4773 scope.go:117] "RemoveContainer" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.094265 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tzm6q"] Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.101543 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tzm6q"] Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.110019 4773 scope.go:117] "RemoveContainer" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.122432 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.127452 4773 scope.go:117] "RemoveContainer" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.145509 4773 scope.go:117] "RemoveContainer" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: W1012 20:34:06.150778 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45dd6108_51bb_40c8_a587_723667eb0383.slice/crio-fac10623a796bbc9b9c3b3060401fa995f0a216f7dc188a469f49b0455fe7d6b WatchSource:0}: Error finding container fac10623a796bbc9b9c3b3060401fa995f0a216f7dc188a469f49b0455fe7d6b: Status 404 returned error can't find the container with id fac10623a796bbc9b9c3b3060401fa995f0a216f7dc188a469f49b0455fe7d6b Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.180769 4773 scope.go:117] "RemoveContainer" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.197014 4773 scope.go:117] "RemoveContainer" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.209638 4773 scope.go:117] "RemoveContainer" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.230313 4773 scope.go:117] "RemoveContainer" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.243007 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.243315 4773 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.243357 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} err="failed to get container status \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.243385 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.243768 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": container with ID starting with 0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0 not found: ID does not exist" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.243809 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} err="failed to get container status \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": rpc error: code = NotFound desc = could not find container 
\"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": container with ID starting with 0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.243836 4773 scope.go:117] "RemoveContainer" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.244193 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": container with ID starting with 471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994 not found: ID does not exist" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244213 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} err="failed to get container status \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": rpc error: code = NotFound desc = could not find container \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": container with ID starting with 471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244230 4773 scope.go:117] "RemoveContainer" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.244452 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": container with ID starting with 131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109 not found: ID does not exist" 
containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244475 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} err="failed to get container status \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": rpc error: code = NotFound desc = could not find container \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": container with ID starting with 131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244489 4773 scope.go:117] "RemoveContainer" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.244856 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": container with ID starting with bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f not found: ID does not exist" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244887 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} err="failed to get container status \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": rpc error: code = NotFound desc = could not find container \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": container with ID starting with bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.244910 4773 scope.go:117] 
"RemoveContainer" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.245167 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": container with ID starting with e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58 not found: ID does not exist" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.245216 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} err="failed to get container status \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": rpc error: code = NotFound desc = could not find container \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": container with ID starting with e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.245287 4773 scope.go:117] "RemoveContainer" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.245613 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": container with ID starting with 323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f not found: ID does not exist" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.245643 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} err="failed to get container status \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": rpc error: code = NotFound desc = could not find container \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": container with ID starting with 323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.245662 4773 scope.go:117] "RemoveContainer" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.245965 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": container with ID starting with ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453 not found: ID does not exist" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.245994 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} err="failed to get container status \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": rpc error: code = NotFound desc = could not find container \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": container with ID starting with ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.246010 4773 scope.go:117] "RemoveContainer" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.246321 4773 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": container with ID starting with 65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03 not found: ID does not exist" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.246362 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} err="failed to get container status \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": rpc error: code = NotFound desc = could not find container \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": container with ID starting with 65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.246386 4773 scope.go:117] "RemoveContainer" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: E1012 20:34:06.246736 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": container with ID starting with 05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc not found: ID does not exist" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.246766 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} err="failed to get container status \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": rpc error: code = NotFound desc = could not find container 
\"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": container with ID starting with 05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.246783 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.247235 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} err="failed to get container status \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.247261 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.247777 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} err="failed to get container status \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": rpc error: code = NotFound desc = could not find container \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": container with ID starting with 0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.247798 4773 scope.go:117] "RemoveContainer" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248061 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} err="failed to get container status \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": rpc error: code = NotFound desc = could not find container \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": container with ID starting with 471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248078 4773 scope.go:117] "RemoveContainer" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248368 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} err="failed to get container status \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": rpc error: code = NotFound desc = could not find container \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": container with ID starting with 131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248394 4773 scope.go:117] "RemoveContainer" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248708 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} err="failed to get container status \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": rpc error: code = NotFound desc = could not find container \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": container with ID starting with 
bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.248764 4773 scope.go:117] "RemoveContainer" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249046 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} err="failed to get container status \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": rpc error: code = NotFound desc = could not find container \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": container with ID starting with e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249071 4773 scope.go:117] "RemoveContainer" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249392 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} err="failed to get container status \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": rpc error: code = NotFound desc = could not find container \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": container with ID starting with 323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249425 4773 scope.go:117] "RemoveContainer" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249755 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} err="failed to get container status \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": rpc error: code = NotFound desc = could not find container \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": container with ID starting with ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.249781 4773 scope.go:117] "RemoveContainer" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.250179 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} err="failed to get container status \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": rpc error: code = NotFound desc = could not find container \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": container with ID starting with 65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.250238 4773 scope.go:117] "RemoveContainer" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.250674 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} err="failed to get container status \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": rpc error: code = NotFound desc = could not find container \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": container with ID starting with 05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc not found: ID does not 
exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.250750 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251061 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} err="failed to get container status \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251087 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251424 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} err="failed to get container status \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": rpc error: code = NotFound desc = could not find container \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": container with ID starting with 0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251451 4773 scope.go:117] "RemoveContainer" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251939 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} err="failed to get container status 
\"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": rpc error: code = NotFound desc = could not find container \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": container with ID starting with 471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.251967 4773 scope.go:117] "RemoveContainer" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.252350 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} err="failed to get container status \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": rpc error: code = NotFound desc = could not find container \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": container with ID starting with 131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.252375 4773 scope.go:117] "RemoveContainer" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.253397 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} err="failed to get container status \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": rpc error: code = NotFound desc = could not find container \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": container with ID starting with bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.253425 4773 scope.go:117] "RemoveContainer" 
containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.253957 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} err="failed to get container status \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": rpc error: code = NotFound desc = could not find container \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": container with ID starting with e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.253987 4773 scope.go:117] "RemoveContainer" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.254341 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} err="failed to get container status \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": rpc error: code = NotFound desc = could not find container \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": container with ID starting with 323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.254360 4773 scope.go:117] "RemoveContainer" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.254737 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} err="failed to get container status \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": rpc error: code = NotFound desc = could 
not find container \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": container with ID starting with ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.254758 4773 scope.go:117] "RemoveContainer" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.255070 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} err="failed to get container status \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": rpc error: code = NotFound desc = could not find container \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": container with ID starting with 65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.255097 4773 scope.go:117] "RemoveContainer" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.255420 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} err="failed to get container status \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": rpc error: code = NotFound desc = could not find container \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": container with ID starting with 05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.255437 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 
20:34:06.255761 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} err="failed to get container status \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.255782 4773 scope.go:117] "RemoveContainer" containerID="0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256079 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0"} err="failed to get container status \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": rpc error: code = NotFound desc = could not find container \"0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0\": container with ID starting with 0c78bb477bb55adcaf27a97881511f7b792c55f12a0a518078e25335a04dc1e0 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256165 4773 scope.go:117] "RemoveContainer" containerID="471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256541 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994"} err="failed to get container status \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": rpc error: code = NotFound desc = could not find container \"471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994\": container with ID starting with 
471599cfb2e0959e6b1c7578ba67e663ad6ba11a07e20a3b0401c95bb32be994 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256562 4773 scope.go:117] "RemoveContainer" containerID="131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256845 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109"} err="failed to get container status \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": rpc error: code = NotFound desc = could not find container \"131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109\": container with ID starting with 131b2b68279ca82a2bc8986fa36c017cd4fd7adea182820fc479fd1887438109 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.256861 4773 scope.go:117] "RemoveContainer" containerID="bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257087 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f"} err="failed to get container status \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": rpc error: code = NotFound desc = could not find container \"bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f\": container with ID starting with bdb6bf4aa127a26ffb3a3d7f2552096739ab446105436cebf2cf0f16cfda976f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257124 4773 scope.go:117] "RemoveContainer" containerID="e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257368 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58"} err="failed to get container status \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": rpc error: code = NotFound desc = could not find container \"e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58\": container with ID starting with e05d4744481b204f0ab3c1acc4afe8e94c7ca13a010dbe56d2ffa20308af1e58 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257387 4773 scope.go:117] "RemoveContainer" containerID="323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257684 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f"} err="failed to get container status \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": rpc error: code = NotFound desc = could not find container \"323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f\": container with ID starting with 323d3eac95d34b7fd7f2f70eb397eb3a6344a8431f0c6138c59cc2babcc0d89f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257703 4773 scope.go:117] "RemoveContainer" containerID="ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.257969 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453"} err="failed to get container status \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": rpc error: code = NotFound desc = could not find container \"ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453\": container with ID starting with ebc41c957634aea3bcd9c19b53d0a4739bc0a2052121007efaac6d658e12a453 not found: ID does not 
exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258010 4773 scope.go:117] "RemoveContainer" containerID="65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258256 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03"} err="failed to get container status \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": rpc error: code = NotFound desc = could not find container \"65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03\": container with ID starting with 65ee08b5a647d740fe114dfc97394877f3f3e0eb8f66d9d3ba591068beb35e03 not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258283 4773 scope.go:117] "RemoveContainer" containerID="05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258560 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc"} err="failed to get container status \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": rpc error: code = NotFound desc = could not find container \"05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc\": container with ID starting with 05a1cf0310ed16601627814f5bf13ea35be960bd2de8606c0b64c343a4624ddc not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258578 4773 scope.go:117] "RemoveContainer" containerID="5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.258974 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f"} err="failed to get container status 
\"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": rpc error: code = NotFound desc = could not find container \"5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f\": container with ID starting with 5ee6adc98c89911d57e32caab40ef832e37b8324ad5bb6855bbd9e4d89a22b8f not found: ID does not exist" Oct 12 20:34:06 crc kubenswrapper[4773]: I1012 20:34:06.496789 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd89b89-9347-4b0d-8861-4ff26c9640b5" path="/var/lib/kubelet/pods/9bd89b89-9347-4b0d-8861-4ff26c9640b5/volumes" Oct 12 20:34:07 crc kubenswrapper[4773]: I1012 20:34:07.053559 4773 generic.go:334] "Generic (PLEG): container finished" podID="45dd6108-51bb-40c8-a587-723667eb0383" containerID="20fedd553bb84c92ff20f62891977789a5b9183a450cc5b82a489e5c891cb96b" exitCode=0 Oct 12 20:34:07 crc kubenswrapper[4773]: I1012 20:34:07.053608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerDied","Data":"20fedd553bb84c92ff20f62891977789a5b9183a450cc5b82a489e5c891cb96b"} Oct 12 20:34:07 crc kubenswrapper[4773]: I1012 20:34:07.053637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"fac10623a796bbc9b9c3b3060401fa995f0a216f7dc188a469f49b0455fe7d6b"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.064522 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"dbf428865f8ab2979c8fad5b6d36e8f61ab17f47cfe0369ce76b3ef48fa11fc6"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.065122 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" 
event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"43ba1af52ff64bd31cfb5d36cc0508f8eba50a6e0490d7a66a98fe5f23e922b5"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.065137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"4e115dfc86ed16c3814a75b51b3a971751e8199455d50c84c99c05e66f7a58c9"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.065149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"b1b7083077931bbb3ed1acacc9736495245539e48d12a143259ae9f3c4614dec"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.065162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"866414d930301988a8a13db1208396310d964e7717e7ddd6dcd76f529c5ecf5d"} Oct 12 20:34:08 crc kubenswrapper[4773]: I1012 20:34:08.065173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"244380d95b268fe6eaa313385c32b1e03c0cdc2499dfa77a69429d05a1bd18c0"} Oct 12 20:34:11 crc kubenswrapper[4773]: I1012 20:34:11.101877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"4cd0f64bffa8368e8f3164c58065db8d9484ed07fac4b4548335e3e80d6e05a7"} Oct 12 20:34:12 crc kubenswrapper[4773]: I1012 20:34:12.648294 4773 scope.go:117] "RemoveContainer" containerID="4b156985db6e08b52f9f20c1f93eac756d70cc6170475213bdd419e7eb6c4dec" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.121355 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" event={"ID":"45dd6108-51bb-40c8-a587-723667eb0383","Type":"ContainerStarted","Data":"6c0550545cb6d0cf74914027bd817176474781ce8ddfb82faa09e709db4d4c90"} Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.121403 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.121459 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.121541 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.128519 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/2.log" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.168260 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.172657 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" podStartSLOduration=8.172634512 podStartE2EDuration="8.172634512s" podCreationTimestamp="2025-10-12 20:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:34:13.170079511 +0000 UTC m=+601.406378121" watchObservedRunningTime="2025-10-12 20:34:13.172634512 +0000 UTC m=+601.408933102" Oct 12 20:34:13 crc kubenswrapper[4773]: I1012 20:34:13.186444 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:17 crc kubenswrapper[4773]: 
I1012 20:34:17.480933 4773 scope.go:117] "RemoveContainer" containerID="47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126" Oct 12 20:34:17 crc kubenswrapper[4773]: E1012 20:34:17.481629 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-67c6h_openshift-multus(69ad9308-d890-40f4-9b73-fb4aad78ccd1)\"" pod="openshift-multus/multus-67c6h" podUID="69ad9308-d890-40f4-9b73-fb4aad78ccd1" Oct 12 20:34:32 crc kubenswrapper[4773]: I1012 20:34:32.485782 4773 scope.go:117] "RemoveContainer" containerID="47a9e1c3c8960606e8d7f5b84a070fd4d124286a85941d89f2b1e1c90998c126" Oct 12 20:34:33 crc kubenswrapper[4773]: I1012 20:34:33.245525 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67c6h_69ad9308-d890-40f4-9b73-fb4aad78ccd1/kube-multus/2.log" Oct 12 20:34:33 crc kubenswrapper[4773]: I1012 20:34:33.245868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67c6h" event={"ID":"69ad9308-d890-40f4-9b73-fb4aad78ccd1","Type":"ContainerStarted","Data":"805347bb96f02bdbb0decc4bc55cb69ba0634057d04938243d64622d48dbf177"} Oct 12 20:34:36 crc kubenswrapper[4773]: I1012 20:34:36.150399 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-psd7k" Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.843164 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh"] Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.844528 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.846686 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.888401 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh"] Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.929143 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqx28\" (UniqueName: \"kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.929295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:44 crc kubenswrapper[4773]: I1012 20:34:44.929424 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: 
I1012 20:34:45.031964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqx28\" (UniqueName: \"kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.032093 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.032199 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.032919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.032982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.050327 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqx28\" (UniqueName: \"kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.159883 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:45 crc kubenswrapper[4773]: I1012 20:34:45.359333 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh"] Oct 12 20:34:46 crc kubenswrapper[4773]: I1012 20:34:46.336150 4773 generic.go:334] "Generic (PLEG): container finished" podID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerID="ecae3cda117fe3bcde948035052d5430b1ee7a4b4fd8761aca205f4f418aee0a" exitCode=0 Oct 12 20:34:46 crc kubenswrapper[4773]: I1012 20:34:46.336262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" event={"ID":"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f","Type":"ContainerDied","Data":"ecae3cda117fe3bcde948035052d5430b1ee7a4b4fd8761aca205f4f418aee0a"} Oct 12 20:34:46 crc kubenswrapper[4773]: I1012 20:34:46.337838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" event={"ID":"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f","Type":"ContainerStarted","Data":"ad6d8125bbcacf5b4b9a66ef1b1d6501e2ba53ab073975b5a2a5a483498fa42d"} Oct 12 20:34:48 crc kubenswrapper[4773]: I1012 20:34:48.355125 4773 generic.go:334] "Generic (PLEG): container finished" podID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerID="d020d7c8976d4edab75667bbdc9b4d33a910ceecf4479cf949f5c83b73627dc2" exitCode=0 Oct 12 20:34:48 crc kubenswrapper[4773]: I1012 20:34:48.355228 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" event={"ID":"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f","Type":"ContainerDied","Data":"d020d7c8976d4edab75667bbdc9b4d33a910ceecf4479cf949f5c83b73627dc2"} Oct 12 20:34:49 crc kubenswrapper[4773]: I1012 20:34:49.364641 4773 generic.go:334] "Generic (PLEG): container finished" podID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerID="37cad0f1356e913d8db080dcfdff1d8e6eacab7dd78309da19b28e08a4bd6039" exitCode=0 Oct 12 20:34:49 crc kubenswrapper[4773]: I1012 20:34:49.364753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" event={"ID":"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f","Type":"ContainerDied","Data":"37cad0f1356e913d8db080dcfdff1d8e6eacab7dd78309da19b28e08a4bd6039"} Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.640390 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.721644 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqx28\" (UniqueName: \"kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28\") pod \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.721753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle\") pod \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.721882 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util\") pod \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\" (UID: \"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f\") " Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.723021 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle" (OuterVolumeSpecName: "bundle") pod "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" (UID: "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.739881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28" (OuterVolumeSpecName: "kube-api-access-gqx28") pod "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" (UID: "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f"). InnerVolumeSpecName "kube-api-access-gqx28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.741018 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util" (OuterVolumeSpecName: "util") pod "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" (UID: "2aa293c3-6ed8-473b-b2dd-ec4a0515d08f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.823610 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqx28\" (UniqueName: \"kubernetes.io/projected/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-kube-api-access-gqx28\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.823649 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:50 crc kubenswrapper[4773]: I1012 20:34:50.823660 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2aa293c3-6ed8-473b-b2dd-ec4a0515d08f-util\") on node \"crc\" DevicePath \"\"" Oct 12 20:34:51 crc kubenswrapper[4773]: I1012 20:34:51.380042 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" event={"ID":"2aa293c3-6ed8-473b-b2dd-ec4a0515d08f","Type":"ContainerDied","Data":"ad6d8125bbcacf5b4b9a66ef1b1d6501e2ba53ab073975b5a2a5a483498fa42d"} Oct 12 20:34:51 crc kubenswrapper[4773]: I1012 20:34:51.380427 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad6d8125bbcacf5b4b9a66ef1b1d6501e2ba53ab073975b5a2a5a483498fa42d" Oct 12 20:34:51 crc kubenswrapper[4773]: I1012 20:34:51.380110 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172189 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w"] Oct 12 20:34:53 crc kubenswrapper[4773]: E1012 20:34:53.172385 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="extract" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172396 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="extract" Oct 12 20:34:53 crc kubenswrapper[4773]: E1012 20:34:53.172407 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="pull" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172412 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="pull" Oct 12 20:34:53 crc kubenswrapper[4773]: E1012 20:34:53.172424 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="util" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172429 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="util" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172543 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa293c3-6ed8-473b-b2dd-ec4a0515d08f" containerName="extract" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.172933 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.175737 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.176043 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.176267 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ckg8b" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.186727 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w"] Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.259017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhddn\" (UniqueName: \"kubernetes.io/projected/c290e672-35df-4626-8034-095052214269-kube-api-access-vhddn\") pod \"nmstate-operator-858ddd8f98-5qz2w\" (UID: \"c290e672-35df-4626-8034-095052214269\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.360633 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhddn\" (UniqueName: \"kubernetes.io/projected/c290e672-35df-4626-8034-095052214269-kube-api-access-vhddn\") pod \"nmstate-operator-858ddd8f98-5qz2w\" (UID: \"c290e672-35df-4626-8034-095052214269\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.377801 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhddn\" (UniqueName: \"kubernetes.io/projected/c290e672-35df-4626-8034-095052214269-kube-api-access-vhddn\") pod \"nmstate-operator-858ddd8f98-5qz2w\" (UID: 
\"c290e672-35df-4626-8034-095052214269\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.485741 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" Oct 12 20:34:53 crc kubenswrapper[4773]: I1012 20:34:53.921093 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w"] Oct 12 20:34:53 crc kubenswrapper[4773]: W1012 20:34:53.926263 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc290e672_35df_4626_8034_095052214269.slice/crio-bef8937d5ca2641370526a790be57d82eed4cc54903bac8d0f5cfda4f6931668 WatchSource:0}: Error finding container bef8937d5ca2641370526a790be57d82eed4cc54903bac8d0f5cfda4f6931668: Status 404 returned error can't find the container with id bef8937d5ca2641370526a790be57d82eed4cc54903bac8d0f5cfda4f6931668 Oct 12 20:34:54 crc kubenswrapper[4773]: I1012 20:34:54.394553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" event={"ID":"c290e672-35df-4626-8034-095052214269","Type":"ContainerStarted","Data":"bef8937d5ca2641370526a790be57d82eed4cc54903bac8d0f5cfda4f6931668"} Oct 12 20:34:57 crc kubenswrapper[4773]: I1012 20:34:57.411964 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" event={"ID":"c290e672-35df-4626-8034-095052214269","Type":"ContainerStarted","Data":"0f545e4a87a21c22f1e29d2a04244aaf8db66c0be5b51c68ae05b82b7497574b"} Oct 12 20:34:57 crc kubenswrapper[4773]: I1012 20:34:57.428323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5qz2w" podStartSLOduration=1.9838910570000001 podStartE2EDuration="4.428294721s" podCreationTimestamp="2025-10-12 20:34:53 +0000 UTC" 
firstStartedPulling="2025-10-12 20:34:53.928119053 +0000 UTC m=+642.164417613" lastFinishedPulling="2025-10-12 20:34:56.372522717 +0000 UTC m=+644.608821277" observedRunningTime="2025-10-12 20:34:57.42610369 +0000 UTC m=+645.662402250" watchObservedRunningTime="2025-10-12 20:34:57.428294721 +0000 UTC m=+645.664593321" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.380746 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.381531 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.385114 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hgzzk" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.392364 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.393077 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.398295 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.405828 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.420458 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.422961 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmqt\" (UniqueName: \"kubernetes.io/projected/751cb256-7079-497a-a027-a9c295bc9832-kube-api-access-ffmqt\") pod \"nmstate-metrics-fdff9cb8d-8mnws\" (UID: \"751cb256-7079-497a-a027-a9c295bc9832\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.423034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.423079 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhfx\" (UniqueName: \"kubernetes.io/projected/fdf1901d-c523-4385-9415-fae96f1ea74c-kube-api-access-gfhfx\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.424155 4773 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gpbvq"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.424838 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.520932 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.521709 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: W1012 20:34:58.524452 4773 reflector.go:561] object-"openshift-nmstate"/"default-dockercfg-hz5q8": failed to list *v1.Secret: secrets "default-dockercfg-hz5q8" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 12 20:34:58 crc kubenswrapper[4773]: E1012 20:34:58.524496 4773 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"default-dockercfg-hz5q8\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-hz5q8\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.524737 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.524790 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj5s\" (UniqueName: \"kubernetes.io/projected/04007580-35e5-42d5-84ec-1e44c4d6d914-kube-api-access-5zj5s\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.524817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhfx\" (UniqueName: \"kubernetes.io/projected/fdf1901d-c523-4385-9415-fae96f1ea74c-kube-api-access-gfhfx\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.524870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-ovs-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: E1012 20:34:58.524922 4773 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 12 20:34:58 crc kubenswrapper[4773]: E1012 20:34:58.524974 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair podName:fdf1901d-c523-4385-9415-fae96f1ea74c nodeName:}" failed. No retries permitted until 2025-10-12 20:34:59.024959153 +0000 UTC m=+647.261257713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair") pod "nmstate-webhook-6cdbc54649-wzw64" (UID: "fdf1901d-c523-4385-9415-fae96f1ea74c") : secret "openshift-nmstate-webhook" not found Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.525011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-nmstate-lock\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.525032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-dbus-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.525050 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmqt\" (UniqueName: \"kubernetes.io/projected/751cb256-7079-497a-a027-a9c295bc9832-kube-api-access-ffmqt\") pod \"nmstate-metrics-fdff9cb8d-8mnws\" (UID: \"751cb256-7079-497a-a027-a9c295bc9832\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" Oct 12 20:34:58 crc kubenswrapper[4773]: W1012 20:34:58.527546 4773 reflector.go:561] object-"openshift-nmstate"/"plugin-serving-cert": failed to list *v1.Secret: secrets "plugin-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 12 20:34:58 crc kubenswrapper[4773]: E1012 20:34:58.527583 4773 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-nmstate\"/\"plugin-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"plugin-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 20:34:58 crc kubenswrapper[4773]: W1012 20:34:58.535452 4773 reflector.go:561] object-"openshift-nmstate"/"nginx-conf": failed to list *v1.ConfigMap: configmaps "nginx-conf" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 12 20:34:58 crc kubenswrapper[4773]: E1012 20:34:58.535491 4773 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nginx-conf\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"nginx-conf\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.536902 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.550585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmqt\" (UniqueName: \"kubernetes.io/projected/751cb256-7079-497a-a027-a9c295bc9832-kube-api-access-ffmqt\") pod \"nmstate-metrics-fdff9cb8d-8mnws\" (UID: \"751cb256-7079-497a-a027-a9c295bc9832\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.561472 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhfx\" (UniqueName: 
\"kubernetes.io/projected/fdf1901d-c523-4385-9415-fae96f1ea74c-kube-api-access-gfhfx\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-nmstate-lock\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627229 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-dbus-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627286 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627307 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7x7h\" (UniqueName: \"kubernetes.io/projected/476898a0-6b77-4b46-8a73-1a0fa1e336c8-kube-api-access-q7x7h\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627375 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5zj5s\" (UniqueName: \"kubernetes.io/projected/04007580-35e5-42d5-84ec-1e44c4d6d914-kube-api-access-5zj5s\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-ovs-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-ovs-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627599 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-nmstate-lock\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.627909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/04007580-35e5-42d5-84ec-1e44c4d6d914-dbus-socket\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.645535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj5s\" (UniqueName: \"kubernetes.io/projected/04007580-35e5-42d5-84ec-1e44c4d6d914-kube-api-access-5zj5s\") pod \"nmstate-handler-gpbvq\" (UID: \"04007580-35e5-42d5-84ec-1e44c4d6d914\") " pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.694504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.721763 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b6d68495c-g7kzx"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.722538 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.729995 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.730297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.730392 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7x7h\" (UniqueName: \"kubernetes.io/projected/476898a0-6b77-4b46-8a73-1a0fa1e336c8-kube-api-access-q7x7h\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.732382 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b6d68495c-g7kzx"] Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.752460 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.772297 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7x7h\" (UniqueName: \"kubernetes.io/projected/476898a0-6b77-4b46-8a73-1a0fa1e336c8-kube-api-access-q7x7h\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:34:58 crc kubenswrapper[4773]: W1012 20:34:58.801223 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04007580_35e5_42d5_84ec_1e44c4d6d914.slice/crio-b557f953bdf758789017a74b78fdc203be8c0be8bf1df889944779fcdfb8765e WatchSource:0}: Error finding container b557f953bdf758789017a74b78fdc203be8c0be8bf1df889944779fcdfb8765e: Status 404 returned error can't find the container with id b557f953bdf758789017a74b78fdc203be8c0be8bf1df889944779fcdfb8765e Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832147 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832187 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832211 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-oauth-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832259 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-service-ca\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832291 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw96p\" (UniqueName: \"kubernetes.io/projected/5ce7e787-6f50-493a-9fed-f7b566e29f15-kube-api-access-tw96p\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832313 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-oauth-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.832340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-trusted-ca-bundle\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc 
kubenswrapper[4773]: I1012 20:34:58.933704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934029 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-oauth-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934101 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-service-ca\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934136 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw96p\" (UniqueName: \"kubernetes.io/projected/5ce7e787-6f50-493a-9fed-f7b566e29f15-kube-api-access-tw96p\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 
20:34:58.934157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-oauth-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934184 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-trusted-ca-bundle\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934968 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-service-ca\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.934999 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-oauth-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.935484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.935686 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce7e787-6f50-493a-9fed-f7b566e29f15-trusted-ca-bundle\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.937134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-oauth-config\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.937789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ce7e787-6f50-493a-9fed-f7b566e29f15-console-serving-cert\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.948659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw96p\" (UniqueName: \"kubernetes.io/projected/5ce7e787-6f50-493a-9fed-f7b566e29f15-kube-api-access-tw96p\") pod \"console-7b6d68495c-g7kzx\" (UID: \"5ce7e787-6f50-493a-9fed-f7b566e29f15\") " pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:58 crc kubenswrapper[4773]: I1012 20:34:58.969412 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws"] Oct 12 20:34:58 crc kubenswrapper[4773]: W1012 20:34:58.974263 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751cb256_7079_497a_a027_a9c295bc9832.slice/crio-d4a2896d52ce903258205feec152e518c9ba270b0c3fd8319c32ba23c12537b8 WatchSource:0}: Error finding container 
d4a2896d52ce903258205feec152e518c9ba270b0c3fd8319c32ba23c12537b8: Status 404 returned error can't find the container with id d4a2896d52ce903258205feec152e518c9ba270b0c3fd8319c32ba23c12537b8 Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.036266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.040237 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fdf1901d-c523-4385-9415-fae96f1ea74c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wzw64\" (UID: \"fdf1901d-c523-4385-9415-fae96f1ea74c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.062093 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.259665 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b6d68495c-g7kzx"] Oct 12 20:34:59 crc kubenswrapper[4773]: W1012 20:34:59.267284 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce7e787_6f50_493a_9fed_f7b566e29f15.slice/crio-8217d0c75596530e1e4d3f511f6932fd4461ccdca048656a3c69e2ef2b40f058 WatchSource:0}: Error finding container 8217d0c75596530e1e4d3f511f6932fd4461ccdca048656a3c69e2ef2b40f058: Status 404 returned error can't find the container with id 8217d0c75596530e1e4d3f511f6932fd4461ccdca048656a3c69e2ef2b40f058 Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.305325 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.420469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gpbvq" event={"ID":"04007580-35e5-42d5-84ec-1e44c4d6d914","Type":"ContainerStarted","Data":"b557f953bdf758789017a74b78fdc203be8c0be8bf1df889944779fcdfb8765e"} Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.421748 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b6d68495c-g7kzx" event={"ID":"5ce7e787-6f50-493a-9fed-f7b566e29f15","Type":"ContainerStarted","Data":"0fa51cb16540ebd3e4346ba1d5d0eae450b8d84dba33f5391475ac8f15bff5a1"} Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.421768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b6d68495c-g7kzx" event={"ID":"5ce7e787-6f50-493a-9fed-f7b566e29f15","Type":"ContainerStarted","Data":"8217d0c75596530e1e4d3f511f6932fd4461ccdca048656a3c69e2ef2b40f058"} Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.425397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" event={"ID":"751cb256-7079-497a-a027-a9c295bc9832","Type":"ContainerStarted","Data":"d4a2896d52ce903258205feec152e518c9ba270b0c3fd8319c32ba23c12537b8"} Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.442878 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b6d68495c-g7kzx" podStartSLOduration=1.442861951 podStartE2EDuration="1.442861951s" podCreationTimestamp="2025-10-12 20:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:34:59.440583638 +0000 UTC m=+647.676882218" watchObservedRunningTime="2025-10-12 20:34:59.442861951 +0000 UTC m=+647.679160511" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.504271 
4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hz5q8" Oct 12 20:34:59 crc kubenswrapper[4773]: I1012 20:34:59.706936 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64"] Oct 12 20:34:59 crc kubenswrapper[4773]: W1012 20:34:59.713361 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf1901d_c523_4385_9415_fae96f1ea74c.slice/crio-333aefdd65c48a12dd77d4aaf12e21df755c3a9f2a8d6c075f0523c6af3702dc WatchSource:0}: Error finding container 333aefdd65c48a12dd77d4aaf12e21df755c3a9f2a8d6c075f0523c6af3702dc: Status 404 returned error can't find the container with id 333aefdd65c48a12dd77d4aaf12e21df755c3a9f2a8d6c075f0523c6af3702dc Oct 12 20:34:59 crc kubenswrapper[4773]: E1012 20:34:59.731819 4773 configmap.go:193] Couldn't get configMap openshift-nmstate/nginx-conf: failed to sync configmap cache: timed out waiting for the condition Oct 12 20:34:59 crc kubenswrapper[4773]: E1012 20:34:59.731887 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf podName:476898a0-6b77-4b46-8a73-1a0fa1e336c8 nodeName:}" failed. No retries permitted until 2025-10-12 20:35:00.231868389 +0000 UTC m=+648.468166939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf") pod "nmstate-console-plugin-6b874cbd85-77gl5" (UID: "476898a0-6b77-4b46-8a73-1a0fa1e336c8") : failed to sync configmap cache: timed out waiting for the condition Oct 12 20:34:59 crc kubenswrapper[4773]: E1012 20:34:59.732046 4773 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 12 20:34:59 crc kubenswrapper[4773]: E1012 20:34:59.732077 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert podName:476898a0-6b77-4b46-8a73-1a0fa1e336c8 nodeName:}" failed. No retries permitted until 2025-10-12 20:35:00.232070105 +0000 UTC m=+648.468368665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-77gl5" (UID: "476898a0-6b77-4b46-8a73-1a0fa1e336c8") : failed to sync secret cache: timed out waiting for the condition Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.016760 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.044266 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.251788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:35:00 crc 
kubenswrapper[4773]: I1012 20:35:00.252243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.253678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/476898a0-6b77-4b46-8a73-1a0fa1e336c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.265775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/476898a0-6b77-4b46-8a73-1a0fa1e336c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-77gl5\" (UID: \"476898a0-6b77-4b46-8a73-1a0fa1e336c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.335975 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.435791 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" event={"ID":"fdf1901d-c523-4385-9415-fae96f1ea74c","Type":"ContainerStarted","Data":"333aefdd65c48a12dd77d4aaf12e21df755c3a9f2a8d6c075f0523c6af3702dc"} Oct 12 20:35:00 crc kubenswrapper[4773]: I1012 20:35:00.543263 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5"] Oct 12 20:35:01 crc kubenswrapper[4773]: I1012 20:35:01.440891 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" event={"ID":"476898a0-6b77-4b46-8a73-1a0fa1e336c8","Type":"ContainerStarted","Data":"022ae44cbb204c998ce71f31c7dd85d92f08347c90c5f0dee5d8bf92178785ed"} Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.452242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gpbvq" event={"ID":"04007580-35e5-42d5-84ec-1e44c4d6d914","Type":"ContainerStarted","Data":"c8a38b6a800e9399f1d7b1739f7036eb97ae3fdcf9cd7d8c77a05652302143c8"} Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.452750 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.454856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" event={"ID":"751cb256-7079-497a-a027-a9c295bc9832","Type":"ContainerStarted","Data":"f1298edea8f0ac1f8d46d04368acabe42de61030b06302e224203d90ebcd43b4"} Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.456654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" 
event={"ID":"fdf1901d-c523-4385-9415-fae96f1ea74c","Type":"ContainerStarted","Data":"fce8ab516481e0babe55f95734735cc96c568b6df663d711c0da2e3bdddb3157"} Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.456783 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.480480 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" podStartSLOduration=2.850827569 podStartE2EDuration="5.480460926s" podCreationTimestamp="2025-10-12 20:34:58 +0000 UTC" firstStartedPulling="2025-10-12 20:34:59.714942638 +0000 UTC m=+647.951241198" lastFinishedPulling="2025-10-12 20:35:02.344575995 +0000 UTC m=+650.580874555" observedRunningTime="2025-10-12 20:35:03.479065627 +0000 UTC m=+651.715364187" watchObservedRunningTime="2025-10-12 20:35:03.480460926 +0000 UTC m=+651.716759486" Oct 12 20:35:03 crc kubenswrapper[4773]: I1012 20:35:03.483466 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gpbvq" podStartSLOduration=1.926089481 podStartE2EDuration="5.48345669s" podCreationTimestamp="2025-10-12 20:34:58 +0000 UTC" firstStartedPulling="2025-10-12 20:34:58.807698016 +0000 UTC m=+647.043996576" lastFinishedPulling="2025-10-12 20:35:02.365065225 +0000 UTC m=+650.601363785" observedRunningTime="2025-10-12 20:35:03.465614713 +0000 UTC m=+651.701913273" watchObservedRunningTime="2025-10-12 20:35:03.48345669 +0000 UTC m=+651.719755250" Oct 12 20:35:04 crc kubenswrapper[4773]: I1012 20:35:04.465582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" event={"ID":"476898a0-6b77-4b46-8a73-1a0fa1e336c8","Type":"ContainerStarted","Data":"eea73120685caebde1bf36d3f710af7ce7b90be75dd8b2c83a3696827f5c885a"} Oct 12 20:35:05 crc kubenswrapper[4773]: I1012 20:35:05.474297 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" event={"ID":"751cb256-7079-497a-a027-a9c295bc9832","Type":"ContainerStarted","Data":"b41f66388d226628e72d1073452e44d708638bacf3c1b5302ed183e24260d0c2"} Oct 12 20:35:05 crc kubenswrapper[4773]: I1012 20:35:05.496580 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-77gl5" podStartSLOduration=4.542541581 podStartE2EDuration="7.496548069s" podCreationTimestamp="2025-10-12 20:34:58 +0000 UTC" firstStartedPulling="2025-10-12 20:35:00.558950402 +0000 UTC m=+648.795248962" lastFinishedPulling="2025-10-12 20:35:03.51295689 +0000 UTC m=+651.749255450" observedRunningTime="2025-10-12 20:35:04.494230252 +0000 UTC m=+652.730528882" watchObservedRunningTime="2025-10-12 20:35:05.496548069 +0000 UTC m=+653.732846669" Oct 12 20:35:05 crc kubenswrapper[4773]: I1012 20:35:05.499131 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-8mnws" podStartSLOduration=1.228426168 podStartE2EDuration="7.499114551s" podCreationTimestamp="2025-10-12 20:34:58 +0000 UTC" firstStartedPulling="2025-10-12 20:34:58.97609416 +0000 UTC m=+647.212392730" lastFinishedPulling="2025-10-12 20:35:05.246782553 +0000 UTC m=+653.483081113" observedRunningTime="2025-10-12 20:35:05.491496559 +0000 UTC m=+653.727795159" watchObservedRunningTime="2025-10-12 20:35:05.499114551 +0000 UTC m=+653.735413151" Oct 12 20:35:08 crc kubenswrapper[4773]: I1012 20:35:08.790543 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gpbvq" Oct 12 20:35:09 crc kubenswrapper[4773]: I1012 20:35:09.062515 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:35:09 crc kubenswrapper[4773]: I1012 20:35:09.062584 4773 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:35:09 crc kubenswrapper[4773]: I1012 20:35:09.069370 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:35:09 crc kubenswrapper[4773]: I1012 20:35:09.506261 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b6d68495c-g7kzx" Oct 12 20:35:09 crc kubenswrapper[4773]: I1012 20:35:09.569521 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:35:19 crc kubenswrapper[4773]: I1012 20:35:19.315589 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wzw64" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.612162 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j"] Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.613664 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.615655 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-gfc75" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" containerName="console" containerID="cri-o://0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92" gracePeriod=15 Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.615756 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.680375 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j"] Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.727426 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.727491 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbrf\" (UniqueName: \"kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.727582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.828805 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.828856 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbrf\" (UniqueName: \"kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.828922 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.829391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.829829 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.852649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbrf\" (UniqueName: \"kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.949968 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gfc75_ad343a90-adad-46cc-b828-93cda758fd2b/console/0.log" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.950038 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:35:34 crc kubenswrapper[4773]: I1012 20:35:34.980869 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.030623 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.031093 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.031359 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.031609 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca" (OuterVolumeSpecName: "service-ca") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.031611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.031752 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.032027 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config" (OuterVolumeSpecName: "console-config") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.032194 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.032359 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.032488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl6zv\" (UniqueName: \"kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv\") pod \"ad343a90-adad-46cc-b828-93cda758fd2b\" (UID: \"ad343a90-adad-46cc-b828-93cda758fd2b\") " Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.033225 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.033232 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.033385 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.033395 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.036068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.036622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.036782 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv" (OuterVolumeSpecName: "kube-api-access-tl6zv") pod "ad343a90-adad-46cc-b828-93cda758fd2b" (UID: "ad343a90-adad-46cc-b828-93cda758fd2b"). InnerVolumeSpecName "kube-api-access-tl6zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.134061 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.134097 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl6zv\" (UniqueName: \"kubernetes.io/projected/ad343a90-adad-46cc-b828-93cda758fd2b-kube-api-access-tl6zv\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.134107 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad343a90-adad-46cc-b828-93cda758fd2b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.134116 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad343a90-adad-46cc-b828-93cda758fd2b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.179087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j"] Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.694499 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="e282a672-4919-479e-9bdc-796dc2986e33" containerID="de629891160f741522d8b23e839bafb4951ff987fe6758c29d98caf5e4aba254" exitCode=0 Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.694554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" event={"ID":"e282a672-4919-479e-9bdc-796dc2986e33","Type":"ContainerDied","Data":"de629891160f741522d8b23e839bafb4951ff987fe6758c29d98caf5e4aba254"} Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.694826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" event={"ID":"e282a672-4919-479e-9bdc-796dc2986e33","Type":"ContainerStarted","Data":"6faabd1b0b4210b2e91516420124109af6b6666a468ea7f12ea4dd0615b502f7"} Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.700996 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gfc75_ad343a90-adad-46cc-b828-93cda758fd2b/console/0.log" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.701057 4773 generic.go:334] "Generic (PLEG): container finished" podID="ad343a90-adad-46cc-b828-93cda758fd2b" containerID="0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92" exitCode=2 Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.701100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfc75" event={"ID":"ad343a90-adad-46cc-b828-93cda758fd2b","Type":"ContainerDied","Data":"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92"} Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.701143 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfc75" event={"ID":"ad343a90-adad-46cc-b828-93cda758fd2b","Type":"ContainerDied","Data":"12b88f31266761f23498339f1187d21b723dae12d7caff904f91b33b04d4b6ca"} Oct 12 20:35:35 crc 
kubenswrapper[4773]: I1012 20:35:35.701172 4773 scope.go:117] "RemoveContainer" containerID="0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.701177 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gfc75" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.741552 4773 scope.go:117] "RemoveContainer" containerID="0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92" Oct 12 20:35:35 crc kubenswrapper[4773]: E1012 20:35:35.742153 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92\": container with ID starting with 0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92 not found: ID does not exist" containerID="0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.742210 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92"} err="failed to get container status \"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92\": rpc error: code = NotFound desc = could not find container \"0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92\": container with ID starting with 0d254d36ae29d7b3f667ae1c1dfe370f95c455ecefa78fe96b69a8e6a8898a92 not found: ID does not exist" Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.780980 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:35:35 crc kubenswrapper[4773]: I1012 20:35:35.784577 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-gfc75"] Oct 12 20:35:36 crc kubenswrapper[4773]: I1012 20:35:36.491409 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" path="/var/lib/kubelet/pods/ad343a90-adad-46cc-b828-93cda758fd2b/volumes" Oct 12 20:35:37 crc kubenswrapper[4773]: I1012 20:35:37.717405 4773 generic.go:334] "Generic (PLEG): container finished" podID="e282a672-4919-479e-9bdc-796dc2986e33" containerID="eb41469fcfbc098d9d6b9d4a4c50a8d47da9d381c4906f744d4a86f6bd27ae99" exitCode=0 Oct 12 20:35:37 crc kubenswrapper[4773]: I1012 20:35:37.717492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" event={"ID":"e282a672-4919-479e-9bdc-796dc2986e33","Type":"ContainerDied","Data":"eb41469fcfbc098d9d6b9d4a4c50a8d47da9d381c4906f744d4a86f6bd27ae99"} Oct 12 20:35:38 crc kubenswrapper[4773]: I1012 20:35:38.726406 4773 generic.go:334] "Generic (PLEG): container finished" podID="e282a672-4919-479e-9bdc-796dc2986e33" containerID="d36b9edd2b1105c1c521aff9165e952001575bb07fbcbfceb01718859916e926" exitCode=0 Oct 12 20:35:38 crc kubenswrapper[4773]: I1012 20:35:38.726517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" event={"ID":"e282a672-4919-479e-9bdc-796dc2986e33","Type":"ContainerDied","Data":"d36b9edd2b1105c1c521aff9165e952001575bb07fbcbfceb01718859916e926"} Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.029392 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.138453 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util\") pod \"e282a672-4919-479e-9bdc-796dc2986e33\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.138550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle\") pod \"e282a672-4919-479e-9bdc-796dc2986e33\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.138609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sbrf\" (UniqueName: \"kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf\") pod \"e282a672-4919-479e-9bdc-796dc2986e33\" (UID: \"e282a672-4919-479e-9bdc-796dc2986e33\") " Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.139968 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle" (OuterVolumeSpecName: "bundle") pod "e282a672-4919-479e-9bdc-796dc2986e33" (UID: "e282a672-4919-479e-9bdc-796dc2986e33"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.147631 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf" (OuterVolumeSpecName: "kube-api-access-9sbrf") pod "e282a672-4919-479e-9bdc-796dc2986e33" (UID: "e282a672-4919-479e-9bdc-796dc2986e33"). InnerVolumeSpecName "kube-api-access-9sbrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.164066 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util" (OuterVolumeSpecName: "util") pod "e282a672-4919-479e-9bdc-796dc2986e33" (UID: "e282a672-4919-479e-9bdc-796dc2986e33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.240606 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sbrf\" (UniqueName: \"kubernetes.io/projected/e282a672-4919-479e-9bdc-796dc2986e33-kube-api-access-9sbrf\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.240651 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-util\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.240670 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e282a672-4919-479e-9bdc-796dc2986e33-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.741552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" event={"ID":"e282a672-4919-479e-9bdc-796dc2986e33","Type":"ContainerDied","Data":"6faabd1b0b4210b2e91516420124109af6b6666a468ea7f12ea4dd0615b502f7"} Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.742036 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6faabd1b0b4210b2e91516420124109af6b6666a468ea7f12ea4dd0615b502f7" Oct 12 20:35:40 crc kubenswrapper[4773]: I1012 20:35:40.741656 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.484786 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm"] Oct 12 20:35:51 crc kubenswrapper[4773]: E1012 20:35:51.485370 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" containerName="console" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485381 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" containerName="console" Oct 12 20:35:51 crc kubenswrapper[4773]: E1012 20:35:51.485399 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="util" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485405 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="util" Oct 12 20:35:51 crc kubenswrapper[4773]: E1012 20:35:51.485413 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="extract" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485419 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="extract" Oct 12 20:35:51 crc kubenswrapper[4773]: E1012 20:35:51.485429 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="pull" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485435 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="pull" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad343a90-adad-46cc-b828-93cda758fd2b" 
containerName="console" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.485704 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e282a672-4919-479e-9bdc-796dc2986e33" containerName="extract" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.486035 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.491978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-sr5g4" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.492911 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.494336 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.496520 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.500029 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.521420 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm"] Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.588014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8st8\" (UniqueName: \"kubernetes.io/projected/774b15ab-55ba-42a6-8a77-13690e6aa683-kube-api-access-j8st8\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " 
pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.588270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-apiservice-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.588408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-webhook-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.690574 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8st8\" (UniqueName: \"kubernetes.io/projected/774b15ab-55ba-42a6-8a77-13690e6aa683-kube-api-access-j8st8\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.690626 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-apiservice-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.690660 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-webhook-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.696242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-apiservice-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.705279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/774b15ab-55ba-42a6-8a77-13690e6aa683-webhook-cert\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.715380 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8st8\" (UniqueName: \"kubernetes.io/projected/774b15ab-55ba-42a6-8a77-13690e6aa683-kube-api-access-j8st8\") pod \"metallb-operator-controller-manager-7d86f779f8-r94wm\" (UID: \"774b15ab-55ba-42a6-8a77-13690e6aa683\") " pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.799978 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.807903 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx"] Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.808578 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.818821 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kr6bw" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.819107 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.819334 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.828936 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx"] Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.892887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-webhook-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.892938 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-apiservice-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" 
(UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.892969 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz4mk\" (UniqueName: \"kubernetes.io/projected/180f9b25-f871-4854-b535-73fd6bd1d7f0-kube-api-access-xz4mk\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.995469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-webhook-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.995798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-apiservice-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:51 crc kubenswrapper[4773]: I1012 20:35:51.995831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz4mk\" (UniqueName: \"kubernetes.io/projected/180f9b25-f871-4854-b535-73fd6bd1d7f0-kube-api-access-xz4mk\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.003489 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-webhook-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.009746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f9b25-f871-4854-b535-73fd6bd1d7f0-apiservice-cert\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.016946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz4mk\" (UniqueName: \"kubernetes.io/projected/180f9b25-f871-4854-b535-73fd6bd1d7f0-kube-api-access-xz4mk\") pod \"metallb-operator-webhook-server-d8b4c7c74-pbqqx\" (UID: \"180f9b25-f871-4854-b535-73fd6bd1d7f0\") " pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.139020 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm"] Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.156257 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.357182 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx"] Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.802085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" event={"ID":"180f9b25-f871-4854-b535-73fd6bd1d7f0","Type":"ContainerStarted","Data":"3b507311110d37ddb86dac70e68574116229f0a80d8474e4f83990dd413d8b55"} Oct 12 20:35:52 crc kubenswrapper[4773]: I1012 20:35:52.803574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" event={"ID":"774b15ab-55ba-42a6-8a77-13690e6aa683","Type":"ContainerStarted","Data":"485d1b80a415b20f4cb82de6104b6d9a91c36a7b69431250d32cf9ff38a4d1e2"} Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.840607 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" event={"ID":"180f9b25-f871-4854-b535-73fd6bd1d7f0","Type":"ContainerStarted","Data":"083d9f64fde20bc69a027ed1e1563238b2d203eee1ce0378bb38235c2d0ca699"} Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.841088 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.841790 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" event={"ID":"774b15ab-55ba-42a6-8a77-13690e6aa683","Type":"ContainerStarted","Data":"04cc230c7b21a433377dab8880a375d6c04a6d299238a91653b59322a1f69079"} Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.842082 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.884123 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" podStartSLOduration=1.912774095 podStartE2EDuration="6.884110669s" podCreationTimestamp="2025-10-12 20:35:51 +0000 UTC" firstStartedPulling="2025-10-12 20:35:52.372418526 +0000 UTC m=+700.608717086" lastFinishedPulling="2025-10-12 20:35:57.3437551 +0000 UTC m=+705.580053660" observedRunningTime="2025-10-12 20:35:57.881205998 +0000 UTC m=+706.117504558" watchObservedRunningTime="2025-10-12 20:35:57.884110669 +0000 UTC m=+706.120409229" Oct 12 20:35:57 crc kubenswrapper[4773]: I1012 20:35:57.904633 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" podStartSLOduration=1.7551344599999998 podStartE2EDuration="6.904617949s" podCreationTimestamp="2025-10-12 20:35:51 +0000 UTC" firstStartedPulling="2025-10-12 20:35:52.177988759 +0000 UTC m=+700.414287319" lastFinishedPulling="2025-10-12 20:35:57.327472248 +0000 UTC m=+705.563770808" observedRunningTime="2025-10-12 20:35:57.90103695 +0000 UTC m=+706.137335510" watchObservedRunningTime="2025-10-12 20:35:57.904617949 +0000 UTC m=+706.140916509" Oct 12 20:35:58 crc kubenswrapper[4773]: I1012 20:35:58.670065 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:35:58 crc kubenswrapper[4773]: I1012 20:35:58.670386 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:36:12 crc kubenswrapper[4773]: I1012 20:36:12.161277 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d8b4c7c74-pbqqx" Oct 12 20:36:28 crc kubenswrapper[4773]: I1012 20:36:28.669839 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:36:28 crc kubenswrapper[4773]: I1012 20:36:28.670523 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:36:31 crc kubenswrapper[4773]: I1012 20:36:31.803435 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d86f779f8-r94wm" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.681939 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-clw9f"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.684108 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.689035 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.689207 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7tmjd" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.689080 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.691446 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.692432 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.694016 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.702114 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.787213 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-df4jg"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.788066 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-df4jg" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.790273 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.790586 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.790747 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zfqp6" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.791042 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792583 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-conf\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fgzk\" (UniqueName: \"kubernetes.io/projected/d895af47-6572-42a4-805b-56be09e5e40c-kube-api-access-6fgzk\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792669 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792686 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-metrics\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792716 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d895af47-6572-42a4-805b-56be09e5e40c-frr-startup\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792750 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-reloader\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.792766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-sockets\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.803080 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-mcbq6"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.803922 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.805796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.822680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-mcbq6"] Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899836 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899855 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metallb-excludel2\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899874 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-metrics\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899893 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g48x\" (UniqueName: \"kubernetes.io/projected/aba7a037-467a-40bd-b2e5-4c446be76185-kube-api-access-2g48x\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899930 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j746\" (UniqueName: \"kubernetes.io/projected/5c70a42a-d5f5-4b1d-b23b-cd672597789c-kube-api-access-4j746\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899955 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d895af47-6572-42a4-805b-56be09e5e40c-frr-startup\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899972 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.899991 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-reloader\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900007 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-sockets\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-cert\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900062 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-conf\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900084 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fgzk\" (UniqueName: \"kubernetes.io/projected/d895af47-6572-42a4-805b-56be09e5e40c-kube-api-access-6fgzk\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s465k\" (UniqueName: 
\"kubernetes.io/projected/9c120d38-3572-486b-9b37-946d2358e130-kube-api-access-s465k\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:32 crc kubenswrapper[4773]: E1012 20:36:32.900264 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 12 20:36:32 crc kubenswrapper[4773]: E1012 20:36:32.900303 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs podName:d895af47-6572-42a4-805b-56be09e5e40c nodeName:}" failed. No retries permitted until 2025-10-12 20:36:33.400288046 +0000 UTC m=+741.636586606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs") pod "frr-k8s-clw9f" (UID: "d895af47-6572-42a4-805b-56be09e5e40c") : secret "frr-k8s-certs-secret" not found Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.900796 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-metrics\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.901429 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d895af47-6572-42a4-805b-56be09e5e40c-frr-startup\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.901606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-reloader\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.901795 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-sockets\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.901960 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d895af47-6572-42a4-805b-56be09e5e40c-frr-conf\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 
20:36:32 crc kubenswrapper[4773]: I1012 20:36:32.934005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fgzk\" (UniqueName: \"kubernetes.io/projected/d895af47-6572-42a4-805b-56be09e5e40c-kube-api-access-6fgzk\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.001908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g48x\" (UniqueName: \"kubernetes.io/projected/aba7a037-467a-40bd-b2e5-4c446be76185-kube-api-access-2g48x\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002267 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002295 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j746\" (UniqueName: \"kubernetes.io/projected/5c70a42a-d5f5-4b1d-b23b-cd672597789c-kube-api-access-4j746\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002421 4773 
secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002501 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs podName:5c70a42a-d5f5-4b1d-b23b-cd672597789c nodeName:}" failed. No retries permitted until 2025-10-12 20:36:33.502482928 +0000 UTC m=+741.738781488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs") pod "speaker-df4jg" (UID: "5c70a42a-d5f5-4b1d-b23b-cd672597789c") : secret "speaker-certs-secret" not found Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-cert\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002522 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002596 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert podName:aba7a037-467a-40bd-b2e5-4c446be76185 nodeName:}" failed. No retries permitted until 2025-10-12 20:36:33.502577261 +0000 UTC m=+741.738875821 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert") pod "frr-k8s-webhook-server-64bf5d555-sq7f5" (UID: "aba7a037-467a-40bd-b2e5-4c446be76185") : secret "frr-k8s-webhook-server-cert" not found Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s465k\" (UniqueName: \"kubernetes.io/projected/9c120d38-3572-486b-9b37-946d2358e130-kube-api-access-s465k\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002686 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.002751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metallb-excludel2\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002770 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 20:36:33 crc 
kubenswrapper[4773]: E1012 20:36:33.002781 4773 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002814 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist podName:5c70a42a-d5f5-4b1d-b23b-cd672597789c nodeName:}" failed. No retries permitted until 2025-10-12 20:36:33.502798347 +0000 UTC m=+741.739096907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist") pod "speaker-df4jg" (UID: "5c70a42a-d5f5-4b1d-b23b-cd672597789c") : secret "metallb-memberlist" not found Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.002832 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs podName:9c120d38-3572-486b-9b37-946d2358e130 nodeName:}" failed. No retries permitted until 2025-10-12 20:36:33.502825458 +0000 UTC m=+741.739124018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs") pod "controller-68d546b9d8-mcbq6" (UID: "9c120d38-3572-486b-9b37-946d2358e130") : secret "controller-certs-secret" not found Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.003391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metallb-excludel2\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.004662 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.028370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-cert\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.034319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j746\" (UniqueName: \"kubernetes.io/projected/5c70a42a-d5f5-4b1d-b23b-cd672597789c-kube-api-access-4j746\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.035446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s465k\" (UniqueName: \"kubernetes.io/projected/9c120d38-3572-486b-9b37-946d2358e130-kube-api-access-s465k\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.036810 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g48x\" (UniqueName: \"kubernetes.io/projected/aba7a037-467a-40bd-b2e5-4c446be76185-kube-api-access-2g48x\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.405945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.419279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d895af47-6572-42a4-805b-56be09e5e40c-metrics-certs\") pod \"frr-k8s-clw9f\" (UID: \"d895af47-6572-42a4-805b-56be09e5e40c\") " pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.507050 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.507109 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.507167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.507235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.507828 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 20:36:33 crc kubenswrapper[4773]: E1012 20:36:33.507880 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist podName:5c70a42a-d5f5-4b1d-b23b-cd672597789c nodeName:}" failed. No retries permitted until 2025-10-12 20:36:34.507866264 +0000 UTC m=+742.744164824 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist") pod "speaker-df4jg" (UID: "5c70a42a-d5f5-4b1d-b23b-cd672597789c") : secret "metallb-memberlist" not found Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.510631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c120d38-3572-486b-9b37-946d2358e130-metrics-certs\") pod \"controller-68d546b9d8-mcbq6\" (UID: \"9c120d38-3572-486b-9b37-946d2358e130\") " pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.511539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aba7a037-467a-40bd-b2e5-4c446be76185-cert\") pod \"frr-k8s-webhook-server-64bf5d555-sq7f5\" (UID: \"aba7a037-467a-40bd-b2e5-4c446be76185\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.511804 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-metrics-certs\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.603209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.609163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:33 crc kubenswrapper[4773]: I1012 20:36:33.716888 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:34 crc kubenswrapper[4773]: I1012 20:36:34.044223 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5"] Oct 12 20:36:34 crc kubenswrapper[4773]: I1012 20:36:34.045667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"0a831e8a601d9a5b6cfcb9bcc5edd480284a6217a14fbdb31415d88dc2e3725a"} Oct 12 20:36:34 crc kubenswrapper[4773]: I1012 20:36:34.207052 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-mcbq6"] Oct 12 20:36:34 crc kubenswrapper[4773]: W1012 20:36:34.209960 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c120d38_3572_486b_9b37_946d2358e130.slice/crio-0edd329831286f503d8dc0b44bda77bbcb9ead703fbd7a1dff065b26a053671b WatchSource:0}: Error finding container 0edd329831286f503d8dc0b44bda77bbcb9ead703fbd7a1dff065b26a053671b: Status 404 returned error can't find the container with id 0edd329831286f503d8dc0b44bda77bbcb9ead703fbd7a1dff065b26a053671b Oct 12 20:36:34 crc kubenswrapper[4773]: I1012 20:36:34.519183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:34 crc kubenswrapper[4773]: E1012 20:36:34.519387 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 20:36:34 crc kubenswrapper[4773]: E1012 20:36:34.519477 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist 
podName:5c70a42a-d5f5-4b1d-b23b-cd672597789c nodeName:}" failed. No retries permitted until 2025-10-12 20:36:36.519454979 +0000 UTC m=+744.755753639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist") pod "speaker-df4jg" (UID: "5c70a42a-d5f5-4b1d-b23b-cd672597789c") : secret "metallb-memberlist" not found Oct 12 20:36:35 crc kubenswrapper[4773]: I1012 20:36:35.050503 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mcbq6" event={"ID":"9c120d38-3572-486b-9b37-946d2358e130","Type":"ContainerStarted","Data":"c32d1b43a5ccba1e966afc2c75c4998386045db06df4867c0832ec43b1863a80"} Oct 12 20:36:35 crc kubenswrapper[4773]: I1012 20:36:35.050816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mcbq6" event={"ID":"9c120d38-3572-486b-9b37-946d2358e130","Type":"ContainerStarted","Data":"6848119c97486d3f470ba9ab6d413f687b1bbdac4aedbe1ab279b2f88f46814f"} Oct 12 20:36:35 crc kubenswrapper[4773]: I1012 20:36:35.050827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mcbq6" event={"ID":"9c120d38-3572-486b-9b37-946d2358e130","Type":"ContainerStarted","Data":"0edd329831286f503d8dc0b44bda77bbcb9ead703fbd7a1dff065b26a053671b"} Oct 12 20:36:35 crc kubenswrapper[4773]: I1012 20:36:35.051638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:35 crc kubenswrapper[4773]: I1012 20:36:35.052475 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" event={"ID":"aba7a037-467a-40bd-b2e5-4c446be76185","Type":"ContainerStarted","Data":"66d84f54b300a74083850401c6a5303fd59392182335ae08667f625eae0569aa"} Oct 12 20:36:36 crc kubenswrapper[4773]: I1012 20:36:36.546427 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:36 crc kubenswrapper[4773]: I1012 20:36:36.561432 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c70a42a-d5f5-4b1d-b23b-cd672597789c-memberlist\") pod \"speaker-df4jg\" (UID: \"5c70a42a-d5f5-4b1d-b23b-cd672597789c\") " pod="metallb-system/speaker-df4jg" Oct 12 20:36:36 crc kubenswrapper[4773]: I1012 20:36:36.700294 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-df4jg" Oct 12 20:36:36 crc kubenswrapper[4773]: W1012 20:36:36.728526 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c70a42a_d5f5_4b1d_b23b_cd672597789c.slice/crio-c374939ab1e654c98ae8343a1751d21294ecf247b3613decfc91d74b68ef9c24 WatchSource:0}: Error finding container c374939ab1e654c98ae8343a1751d21294ecf247b3613decfc91d74b68ef9c24: Status 404 returned error can't find the container with id c374939ab1e654c98ae8343a1751d21294ecf247b3613decfc91d74b68ef9c24 Oct 12 20:36:37 crc kubenswrapper[4773]: I1012 20:36:37.069084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-df4jg" event={"ID":"5c70a42a-d5f5-4b1d-b23b-cd672597789c","Type":"ContainerStarted","Data":"75f4632ac3be54148a781f896c0f8a74220d0e8c7d2fd0deb93dc442b44206b1"} Oct 12 20:36:37 crc kubenswrapper[4773]: I1012 20:36:37.069340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-df4jg" event={"ID":"5c70a42a-d5f5-4b1d-b23b-cd672597789c","Type":"ContainerStarted","Data":"c374939ab1e654c98ae8343a1751d21294ecf247b3613decfc91d74b68ef9c24"} Oct 12 20:36:38 crc kubenswrapper[4773]: I1012 20:36:38.074235 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-df4jg" event={"ID":"5c70a42a-d5f5-4b1d-b23b-cd672597789c","Type":"ContainerStarted","Data":"37c1ba48de992822ca747c36706fc1bfa35621628cfa686c9e43d6af2ed966b2"} Oct 12 20:36:38 crc kubenswrapper[4773]: I1012 20:36:38.074913 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-df4jg" Oct 12 20:36:38 crc kubenswrapper[4773]: I1012 20:36:38.091555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-mcbq6" podStartSLOduration=6.091532067 podStartE2EDuration="6.091532067s" podCreationTimestamp="2025-10-12 20:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:36:35.067981835 +0000 UTC m=+743.304280395" watchObservedRunningTime="2025-10-12 20:36:38.091532067 +0000 UTC m=+746.327830627" Oct 12 20:36:40 crc kubenswrapper[4773]: I1012 20:36:40.902314 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-df4jg" podStartSLOduration=8.902291861 podStartE2EDuration="8.902291861s" podCreationTimestamp="2025-10-12 20:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:36:38.09484725 +0000 UTC m=+746.331145810" watchObservedRunningTime="2025-10-12 20:36:40.902291861 +0000 UTC m=+749.138590421" Oct 12 20:36:40 crc kubenswrapper[4773]: I1012 20:36:40.905812 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:36:40 crc kubenswrapper[4773]: I1012 20:36:40.906069 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" 
containerName="controller-manager" containerID="cri-o://2d6bec76dd71d7441b8fc6418295b20fb8a5a7404e88aebc0bb2792e964d73d2" gracePeriod=30 Oct 12 20:36:40 crc kubenswrapper[4773]: I1012 20:36:40.999636 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:36:41 crc kubenswrapper[4773]: I1012 20:36:40.999880 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" podUID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" containerName="route-controller-manager" containerID="cri-o://dec783533cfe12a04aae6e622f314f2288413256bca182436f8edf10b5f63246" gracePeriod=30 Oct 12 20:36:41 crc kubenswrapper[4773]: I1012 20:36:41.105563 4773 generic.go:334] "Generic (PLEG): container finished" podID="cd934a31-cbf9-4b33-831c-2622adbe4f76" containerID="2d6bec76dd71d7441b8fc6418295b20fb8a5a7404e88aebc0bb2792e964d73d2" exitCode=0 Oct 12 20:36:41 crc kubenswrapper[4773]: I1012 20:36:41.105604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" event={"ID":"cd934a31-cbf9-4b33-831c-2622adbe4f76","Type":"ContainerDied","Data":"2d6bec76dd71d7441b8fc6418295b20fb8a5a7404e88aebc0bb2792e964d73d2"} Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.113657 4773 generic.go:334] "Generic (PLEG): container finished" podID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" containerID="dec783533cfe12a04aae6e622f314f2288413256bca182436f8edf10b5f63246" exitCode=0 Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.113995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" event={"ID":"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6","Type":"ContainerDied","Data":"dec783533cfe12a04aae6e622f314f2288413256bca182436f8edf10b5f63246"} Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 
20:36:42.257837 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.274272 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.314585 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4"] Oct 12 20:36:42 crc kubenswrapper[4773]: E1012 20:36:42.314892 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" containerName="route-controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.314911 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" containerName="route-controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: E1012 20:36:42.314933 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" containerName="controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.314941 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" containerName="controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.315070 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" containerName="controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.315087 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" containerName="route-controller-manager" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.315546 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336287 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config\") pod \"cd934a31-cbf9-4b33-831c-2622adbe4f76\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblzk\" (UniqueName: \"kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk\") pod \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336364 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7htq\" (UniqueName: \"kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq\") pod \"cd934a31-cbf9-4b33-831c-2622adbe4f76\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles\") pod \"cd934a31-cbf9-4b33-831c-2622adbe4f76\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca\") pod \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336519 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert\") pod \"cd934a31-cbf9-4b33-831c-2622adbe4f76\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336543 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config\") pod \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336559 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert\") pod \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\" (UID: \"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336599 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca\") pod \"cd934a31-cbf9-4b33-831c-2622adbe4f76\" (UID: \"cd934a31-cbf9-4b33-831c-2622adbe4f76\") " Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7551dc5-7a42-4655-ab65-3f0ceca2d791-serving-cert\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336809 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-config\") pod 
\"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336831 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-client-ca\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.336904 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj52g\" (UniqueName: \"kubernetes.io/projected/f7551dc5-7a42-4655-ab65-3f0ceca2d791-kube-api-access-vj52g\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.338443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config" (OuterVolumeSpecName: "config") pod "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" (UID: "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.340088 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config" (OuterVolumeSpecName: "config") pod "cd934a31-cbf9-4b33-831c-2622adbe4f76" (UID: "cd934a31-cbf9-4b33-831c-2622adbe4f76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.340543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cd934a31-cbf9-4b33-831c-2622adbe4f76" (UID: "cd934a31-cbf9-4b33-831c-2622adbe4f76"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.342886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca" (OuterVolumeSpecName: "client-ca") pod "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" (UID: "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.343363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd934a31-cbf9-4b33-831c-2622adbe4f76" (UID: "cd934a31-cbf9-4b33-831c-2622adbe4f76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.370977 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd934a31-cbf9-4b33-831c-2622adbe4f76" (UID: "cd934a31-cbf9-4b33-831c-2622adbe4f76"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.371847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk" (OuterVolumeSpecName: "kube-api-access-xblzk") pod "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" (UID: "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6"). InnerVolumeSpecName "kube-api-access-xblzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.379652 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4"] Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.380150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq" (OuterVolumeSpecName: "kube-api-access-n7htq") pod "cd934a31-cbf9-4b33-831c-2622adbe4f76" (UID: "cd934a31-cbf9-4b33-831c-2622adbe4f76"). InnerVolumeSpecName "kube-api-access-n7htq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.382827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" (UID: "d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.437866 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7551dc5-7a42-4655-ab65-3f0ceca2d791-serving-cert\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-client-ca\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-config\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438347 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj52g\" (UniqueName: \"kubernetes.io/projected/f7551dc5-7a42-4655-ab65-3f0ceca2d791-kube-api-access-vj52g\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438408 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438424 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblzk\" (UniqueName: \"kubernetes.io/projected/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-kube-api-access-xblzk\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438439 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7htq\" (UniqueName: \"kubernetes.io/projected/cd934a31-cbf9-4b33-831c-2622adbe4f76-kube-api-access-n7htq\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438453 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438464 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438475 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd934a31-cbf9-4b33-831c-2622adbe4f76-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438488 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.438498 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc 
kubenswrapper[4773]: I1012 20:36:42.438509 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd934a31-cbf9-4b33-831c-2622adbe4f76-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.439088 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-client-ca\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.439558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7551dc5-7a42-4655-ab65-3f0ceca2d791-config\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.441455 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7551dc5-7a42-4655-ab65-3f0ceca2d791-serving-cert\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 20:36:42.454863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj52g\" (UniqueName: \"kubernetes.io/projected/f7551dc5-7a42-4655-ab65-3f0ceca2d791-kube-api-access-vj52g\") pod \"route-controller-manager-5b4b4996bb-s77k4\" (UID: \"f7551dc5-7a42-4655-ab65-3f0ceca2d791\") " pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:42 crc kubenswrapper[4773]: I1012 
20:36:42.643449 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.052082 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4"] Oct 12 20:36:43 crc kubenswrapper[4773]: W1012 20:36:43.060842 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7551dc5_7a42_4655_ab65_3f0ceca2d791.slice/crio-ad451642c128cfc44fbc2f2f2c1de3279a7e4190f3af88d24f2c3f0b8e227312 WatchSource:0}: Error finding container ad451642c128cfc44fbc2f2f2c1de3279a7e4190f3af88d24f2c3f0b8e227312: Status 404 returned error can't find the container with id ad451642c128cfc44fbc2f2f2c1de3279a7e4190f3af88d24f2c3f0b8e227312 Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.120499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" event={"ID":"f7551dc5-7a42-4655-ab65-3f0ceca2d791","Type":"ContainerStarted","Data":"ad451642c128cfc44fbc2f2f2c1de3279a7e4190f3af88d24f2c3f0b8e227312"} Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.121906 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.122375 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2" event={"ID":"d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6","Type":"ContainerDied","Data":"a674099f124e2f2d1615f7b84efdf57225669e523aa3f3160d78501ab8bf41c5"} Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.122399 4773 scope.go:117] "RemoveContainer" containerID="dec783533cfe12a04aae6e622f314f2288413256bca182436f8edf10b5f63246" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.124222 4773 generic.go:334] "Generic (PLEG): container finished" podID="d895af47-6572-42a4-805b-56be09e5e40c" containerID="9645e38b00900668c2b51adf1f3a2db115c29185964e3f7dca137eda6e11b52e" exitCode=0 Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.124264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerDied","Data":"9645e38b00900668c2b51adf1f3a2db115c29185964e3f7dca137eda6e11b52e"} Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.126252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" event={"ID":"aba7a037-467a-40bd-b2e5-4c446be76185","Type":"ContainerStarted","Data":"08bf0a968e2cad1e63d043b710fd83119ac0950b60a1c9b4232001a92582e7eb"} Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.126584 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.129296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" 
event={"ID":"cd934a31-cbf9-4b33-831c-2622adbe4f76","Type":"ContainerDied","Data":"9c68021a06d3a8605ef7471383cebb1ed3502738b10df5f23bd68ea331a69ccf"} Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.129376 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wc5dc" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.137676 4773 scope.go:117] "RemoveContainer" containerID="2d6bec76dd71d7441b8fc6418295b20fb8a5a7404e88aebc0bb2792e964d73d2" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.171377 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.180699 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wc5dc"] Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.203822 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" podStartSLOduration=3.178308623 podStartE2EDuration="11.203808402s" podCreationTimestamp="2025-10-12 20:36:32 +0000 UTC" firstStartedPulling="2025-10-12 20:36:34.056916915 +0000 UTC m=+742.293215475" lastFinishedPulling="2025-10-12 20:36:42.082416694 +0000 UTC m=+750.318715254" observedRunningTime="2025-10-12 20:36:43.202000781 +0000 UTC m=+751.438299341" watchObservedRunningTime="2025-10-12 20:36:43.203808402 +0000 UTC m=+751.440106962" Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.212463 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:36:43 crc kubenswrapper[4773]: I1012 20:36:43.215238 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpct2"] Oct 12 20:36:44 crc 
kubenswrapper[4773]: I1012 20:36:44.136238 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" event={"ID":"f7551dc5-7a42-4655-ab65-3f0ceca2d791","Type":"ContainerStarted","Data":"de3a098b8f438da7d01be38d016d07aa02ea9166aa0f9e2f5dc0af7f178379d7"} Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.136734 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.139003 4773 generic.go:334] "Generic (PLEG): container finished" podID="d895af47-6572-42a4-805b-56be09e5e40c" containerID="86439f72b169f149c68d81b2cd6335d42f858563cdb883faf65092e664fa7e1f" exitCode=0 Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.139274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerDied","Data":"86439f72b169f149c68d81b2cd6335d42f858563cdb883faf65092e664fa7e1f"} Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.142488 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.157341 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b4b4996bb-s77k4" podStartSLOduration=3.157324801 podStartE2EDuration="3.157324801s" podCreationTimestamp="2025-10-12 20:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:36:44.154970716 +0000 UTC m=+752.391269276" watchObservedRunningTime="2025-10-12 20:36:44.157324801 +0000 UTC m=+752.393623361" Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.503473 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd934a31-cbf9-4b33-831c-2622adbe4f76" path="/var/lib/kubelet/pods/cd934a31-cbf9-4b33-831c-2622adbe4f76/volumes" Oct 12 20:36:44 crc kubenswrapper[4773]: I1012 20:36:44.504896 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6" path="/var/lib/kubelet/pods/d5c2c176-d09a-48cd-9b01-4e7eda5a6ca6/volumes" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.118554 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz"] Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.120461 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.123617 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.123947 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.124278 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.124499 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.124788 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.125682 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 
20:36:45.138136 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.138599 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz"] Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.158061 4773 generic.go:334] "Generic (PLEG): container finished" podID="d895af47-6572-42a4-805b-56be09e5e40c" containerID="c2612cf8e6ff562933bd51a047a8e3fe4fcb0896f246d37a3d7e9d6e76d361aa" exitCode=0 Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.158395 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerDied","Data":"c2612cf8e6ff562933bd51a047a8e3fe4fcb0896f246d37a3d7e9d6e76d361aa"} Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.170599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-config\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.170704 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-client-ca\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.170817 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsld9\" (UniqueName: 
\"kubernetes.io/projected/1411fb49-45a2-4434-a9fc-1f01ede2ad00-kube-api-access-tsld9\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.170845 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1411fb49-45a2-4434-a9fc-1f01ede2ad00-serving-cert\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.170870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-proxy-ca-bundles\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.272118 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-config\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.272185 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-client-ca\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 
20:36:45.272226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsld9\" (UniqueName: \"kubernetes.io/projected/1411fb49-45a2-4434-a9fc-1f01ede2ad00-kube-api-access-tsld9\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.272245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1411fb49-45a2-4434-a9fc-1f01ede2ad00-serving-cert\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.272262 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-proxy-ca-bundles\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.273063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-client-ca\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.273295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-proxy-ca-bundles\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " 
pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.273417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1411fb49-45a2-4434-a9fc-1f01ede2ad00-config\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.279082 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1411fb49-45a2-4434-a9fc-1f01ede2ad00-serving-cert\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.299477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsld9\" (UniqueName: \"kubernetes.io/projected/1411fb49-45a2-4434-a9fc-1f01ede2ad00-kube-api-access-tsld9\") pod \"controller-manager-74d5cbbc7d-s75nz\" (UID: \"1411fb49-45a2-4434-a9fc-1f01ede2ad00\") " pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.443128 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:45 crc kubenswrapper[4773]: I1012 20:36:45.795485 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz"] Oct 12 20:36:45 crc kubenswrapper[4773]: W1012 20:36:45.815848 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1411fb49_45a2_4434_a9fc_1f01ede2ad00.slice/crio-0edc3f7840c5a624555c6ed853afffb3580868f55d5456a5919222cfb9819ee3 WatchSource:0}: Error finding container 0edc3f7840c5a624555c6ed853afffb3580868f55d5456a5919222cfb9819ee3: Status 404 returned error can't find the container with id 0edc3f7840c5a624555c6ed853afffb3580868f55d5456a5919222cfb9819ee3 Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.167691 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"64f8355b613ddbd56f0efdf9ed3cabb3d010d385487686d9d41017526ddea65f"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.167773 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"4bde099bdb39e2906ae607b43eb4a8bd50af3fba4cc22d14e5665a571f4cdfd5"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.167789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"458e09d77c47e0da694da9fca7242f31ddfef4ea62682692ce44f2cdd3fde0ab"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.167799 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" 
event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"5e6d1fc40772a0f8d7e6aa748ec47a4255f6367dba5e04d8711b5bcb4f79c0ae"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.167809 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"e66a6cb9da5b348ba8eb1d2780b0d3fabb0bc34325eac07fe42d670980c02932"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.170365 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" event={"ID":"1411fb49-45a2-4434-a9fc-1f01ede2ad00","Type":"ContainerStarted","Data":"e60dbf3dca2862bae348265423d64cb91e9464e393369bbca966caf59e9931a5"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.170408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" event={"ID":"1411fb49-45a2-4434-a9fc-1f01ede2ad00","Type":"ContainerStarted","Data":"0edc3f7840c5a624555c6ed853afffb3580868f55d5456a5919222cfb9819ee3"} Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.170509 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.193524 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" podStartSLOduration=5.193509463 podStartE2EDuration="5.193509463s" podCreationTimestamp="2025-10-12 20:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:36:46.188764181 +0000 UTC m=+754.425062741" watchObservedRunningTime="2025-10-12 20:36:46.193509463 +0000 UTC m=+754.429808023" Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.212575 
4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74d5cbbc7d-s75nz" Oct 12 20:36:46 crc kubenswrapper[4773]: I1012 20:36:46.704190 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-df4jg" Oct 12 20:36:47 crc kubenswrapper[4773]: I1012 20:36:47.181325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-clw9f" event={"ID":"d895af47-6572-42a4-805b-56be09e5e40c","Type":"ContainerStarted","Data":"9b254af42ff0018583399611a5b849608b3264edf18cbbb83842017966a4247f"} Oct 12 20:36:47 crc kubenswrapper[4773]: I1012 20:36:47.181800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:47 crc kubenswrapper[4773]: I1012 20:36:47.203573 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-clw9f" podStartSLOduration=6.90606522 podStartE2EDuration="15.203554604s" podCreationTimestamp="2025-10-12 20:36:32 +0000 UTC" firstStartedPulling="2025-10-12 20:36:33.777956726 +0000 UTC m=+742.014255326" lastFinishedPulling="2025-10-12 20:36:42.07544615 +0000 UTC m=+750.311744710" observedRunningTime="2025-10-12 20:36:47.202293299 +0000 UTC m=+755.438591859" watchObservedRunningTime="2025-10-12 20:36:47.203554604 +0000 UTC m=+755.439853164" Oct 12 20:36:48 crc kubenswrapper[4773]: I1012 20:36:48.603852 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:48 crc kubenswrapper[4773]: I1012 20:36:48.654521 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.621685 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.622825 4773 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.628967 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.629442 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.642637 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.728296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhnpl\" (UniqueName: \"kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl\") pod \"openstack-operator-index-pj86l\" (UID: \"91bcb647-b537-4482-b145-1b5f55d704f9\") " pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.829974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhnpl\" (UniqueName: \"kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl\") pod \"openstack-operator-index-pj86l\" (UID: \"91bcb647-b537-4482-b145-1b5f55d704f9\") " pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:49 crc kubenswrapper[4773]: I1012 20:36:49.847851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhnpl\" (UniqueName: \"kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl\") pod \"openstack-operator-index-pj86l\" (UID: \"91bcb647-b537-4482-b145-1b5f55d704f9\") " pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:50 crc kubenswrapper[4773]: I1012 20:36:50.006326 4773 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:50 crc kubenswrapper[4773]: I1012 20:36:50.407008 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:50 crc kubenswrapper[4773]: W1012 20:36:50.413367 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91bcb647_b537_4482_b145_1b5f55d704f9.slice/crio-6cf2e727bc39042cdd2b44da296fb2a298240e4c89a042b4d947949b6ca05367 WatchSource:0}: Error finding container 6cf2e727bc39042cdd2b44da296fb2a298240e4c89a042b4d947949b6ca05367: Status 404 returned error can't find the container with id 6cf2e727bc39042cdd2b44da296fb2a298240e4c89a042b4d947949b6ca05367 Oct 12 20:36:51 crc kubenswrapper[4773]: I1012 20:36:51.208036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pj86l" event={"ID":"91bcb647-b537-4482-b145-1b5f55d704f9","Type":"ContainerStarted","Data":"6cf2e727bc39042cdd2b44da296fb2a298240e4c89a042b4d947949b6ca05367"} Oct 12 20:36:51 crc kubenswrapper[4773]: I1012 20:36:51.352832 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 12 20:36:52 crc kubenswrapper[4773]: I1012 20:36:52.215488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pj86l" event={"ID":"91bcb647-b537-4482-b145-1b5f55d704f9","Type":"ContainerStarted","Data":"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a"} Oct 12 20:36:52 crc kubenswrapper[4773]: I1012 20:36:52.239853 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pj86l" podStartSLOduration=2.279731862 podStartE2EDuration="3.239823055s" podCreationTimestamp="2025-10-12 20:36:49 +0000 UTC" 
firstStartedPulling="2025-10-12 20:36:50.41636872 +0000 UTC m=+758.652667280" lastFinishedPulling="2025-10-12 20:36:51.376459903 +0000 UTC m=+759.612758473" observedRunningTime="2025-10-12 20:36:52.238934251 +0000 UTC m=+760.475232821" watchObservedRunningTime="2025-10-12 20:36:52.239823055 +0000 UTC m=+760.476121655" Oct 12 20:36:52 crc kubenswrapper[4773]: I1012 20:36:52.987149 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.607036 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5p2sl"] Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.608327 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.614886 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-sq7f5" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.617134 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-htq7t" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.633827 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5p2sl"] Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.727753 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-mcbq6" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.792011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b4t2\" (UniqueName: \"kubernetes.io/projected/237d38ea-2958-4510-a3e3-20b37bf0814d-kube-api-access-6b4t2\") pod \"openstack-operator-index-5p2sl\" (UID: \"237d38ea-2958-4510-a3e3-20b37bf0814d\") " 
pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.894231 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b4t2\" (UniqueName: \"kubernetes.io/projected/237d38ea-2958-4510-a3e3-20b37bf0814d-kube-api-access-6b4t2\") pod \"openstack-operator-index-5p2sl\" (UID: \"237d38ea-2958-4510-a3e3-20b37bf0814d\") " pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.921478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b4t2\" (UniqueName: \"kubernetes.io/projected/237d38ea-2958-4510-a3e3-20b37bf0814d-kube-api-access-6b4t2\") pod \"openstack-operator-index-5p2sl\" (UID: \"237d38ea-2958-4510-a3e3-20b37bf0814d\") " pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:36:53 crc kubenswrapper[4773]: I1012 20:36:53.942111 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:36:54 crc kubenswrapper[4773]: I1012 20:36:54.228682 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-pj86l" podUID="91bcb647-b537-4482-b145-1b5f55d704f9" containerName="registry-server" containerID="cri-o://e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a" gracePeriod=2 Oct 12 20:36:54 crc kubenswrapper[4773]: I1012 20:36:54.366983 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5p2sl"] Oct 12 20:36:54 crc kubenswrapper[4773]: W1012 20:36:54.371922 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod237d38ea_2958_4510_a3e3_20b37bf0814d.slice/crio-86025b981ce5b1b6eb08c9266c7f50402496ee753e41b212736a8a99a6d4ab33 WatchSource:0}: Error finding container 
86025b981ce5b1b6eb08c9266c7f50402496ee753e41b212736a8a99a6d4ab33: Status 404 returned error can't find the container with id 86025b981ce5b1b6eb08c9266c7f50402496ee753e41b212736a8a99a6d4ab33 Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.052227 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.109344 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhnpl\" (UniqueName: \"kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl\") pod \"91bcb647-b537-4482-b145-1b5f55d704f9\" (UID: \"91bcb647-b537-4482-b145-1b5f55d704f9\") " Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.115362 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl" (OuterVolumeSpecName: "kube-api-access-vhnpl") pod "91bcb647-b537-4482-b145-1b5f55d704f9" (UID: "91bcb647-b537-4482-b145-1b5f55d704f9"). InnerVolumeSpecName "kube-api-access-vhnpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.211073 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhnpl\" (UniqueName: \"kubernetes.io/projected/91bcb647-b537-4482-b145-1b5f55d704f9-kube-api-access-vhnpl\") on node \"crc\" DevicePath \"\"" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.238956 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5p2sl" event={"ID":"237d38ea-2958-4510-a3e3-20b37bf0814d","Type":"ContainerStarted","Data":"d2f75062e286ca5d97d55c5910cdfc776a471f447a70df38d66e58313b807258"} Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.239012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5p2sl" event={"ID":"237d38ea-2958-4510-a3e3-20b37bf0814d","Type":"ContainerStarted","Data":"86025b981ce5b1b6eb08c9266c7f50402496ee753e41b212736a8a99a6d4ab33"} Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.240531 4773 generic.go:334] "Generic (PLEG): container finished" podID="91bcb647-b537-4482-b145-1b5f55d704f9" containerID="e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a" exitCode=0 Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.240580 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pj86l" event={"ID":"91bcb647-b537-4482-b145-1b5f55d704f9","Type":"ContainerDied","Data":"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a"} Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.240629 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pj86l" event={"ID":"91bcb647-b537-4482-b145-1b5f55d704f9","Type":"ContainerDied","Data":"6cf2e727bc39042cdd2b44da296fb2a298240e4c89a042b4d947949b6ca05367"} Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.240633 4773 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pj86l" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.240647 4773 scope.go:117] "RemoveContainer" containerID="e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.257299 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5p2sl" podStartSLOduration=1.82050895 podStartE2EDuration="2.257274638s" podCreationTimestamp="2025-10-12 20:36:53 +0000 UTC" firstStartedPulling="2025-10-12 20:36:54.375707739 +0000 UTC m=+762.612006299" lastFinishedPulling="2025-10-12 20:36:54.812473387 +0000 UTC m=+763.048771987" observedRunningTime="2025-10-12 20:36:55.255481648 +0000 UTC m=+763.491780248" watchObservedRunningTime="2025-10-12 20:36:55.257274638 +0000 UTC m=+763.493573228" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.271042 4773 scope.go:117] "RemoveContainer" containerID="e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a" Oct 12 20:36:55 crc kubenswrapper[4773]: E1012 20:36:55.271469 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a\": container with ID starting with e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a not found: ID does not exist" containerID="e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.271505 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a"} err="failed to get container status \"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a\": rpc error: code = NotFound desc = could not find container 
\"e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a\": container with ID starting with e4110c9b7e10e961c046b26759f5d330ce3593bf3ee524340bfeb4d27448f20a not found: ID does not exist" Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.283235 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:55 crc kubenswrapper[4773]: I1012 20:36:55.287339 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-pj86l"] Oct 12 20:36:56 crc kubenswrapper[4773]: I1012 20:36:56.490200 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bcb647-b537-4482-b145-1b5f55d704f9" path="/var/lib/kubelet/pods/91bcb647-b537-4482-b145-1b5f55d704f9/volumes" Oct 12 20:36:58 crc kubenswrapper[4773]: I1012 20:36:58.669233 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:36:58 crc kubenswrapper[4773]: I1012 20:36:58.669315 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:36:58 crc kubenswrapper[4773]: I1012 20:36:58.669373 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:36:58 crc kubenswrapper[4773]: I1012 20:36:58.670089 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:36:58 crc kubenswrapper[4773]: I1012 20:36:58.670192 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa" gracePeriod=600 Oct 12 20:36:59 crc kubenswrapper[4773]: I1012 20:36:59.269302 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa" exitCode=0 Oct 12 20:36:59 crc kubenswrapper[4773]: I1012 20:36:59.269540 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa"} Oct 12 20:36:59 crc kubenswrapper[4773]: I1012 20:36:59.269564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635"} Oct 12 20:36:59 crc kubenswrapper[4773]: I1012 20:36:59.269580 4773 scope.go:117] "RemoveContainer" containerID="50cf730b0a664aa6273b5384482669e8042ad9c84abc280e1e2b88cfe6018b4b" Oct 12 20:37:03 crc kubenswrapper[4773]: I1012 20:37:03.607329 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-clw9f" Oct 12 20:37:03 crc kubenswrapper[4773]: I1012 
20:37:03.944826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:37:03 crc kubenswrapper[4773]: I1012 20:37:03.944885 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:37:03 crc kubenswrapper[4773]: I1012 20:37:03.987107 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:37:04 crc kubenswrapper[4773]: I1012 20:37:04.337903 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5p2sl" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.231240 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr"] Oct 12 20:37:05 crc kubenswrapper[4773]: E1012 20:37:05.232645 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bcb647-b537-4482-b145-1b5f55d704f9" containerName="registry-server" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.232827 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bcb647-b537-4482-b145-1b5f55d704f9" containerName="registry-server" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.233187 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="91bcb647-b537-4482-b145-1b5f55d704f9" containerName="registry-server" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.234677 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.237248 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6wf68" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.240136 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr"] Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.357592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqnqw\" (UniqueName: \"kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.357975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.358024 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 
20:37:05.459768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnqw\" (UniqueName: \"kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.459822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.459843 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.460238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.460591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.490626 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnqw\" (UniqueName: \"kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.554842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" Oct 12 20:37:05 crc kubenswrapper[4773]: I1012 20:37:05.999244 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr"] Oct 12 20:37:06 crc kubenswrapper[4773]: W1012 20:37:06.008011 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38982daf_184a_4b70_b9d5_f37c23f908f2.slice/crio-74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5 WatchSource:0}: Error finding container 74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5: Status 404 returned error can't find the container with id 74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5 Oct 12 20:37:06 crc kubenswrapper[4773]: I1012 20:37:06.319854 4773 generic.go:334] "Generic (PLEG): container finished" podID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerID="6d939e12986918b1c5cb31f6fabb53f2245bfb852dcbc1c931435ce0715daa4a" exitCode=0 Oct 12 
20:37:06 crc kubenswrapper[4773]: I1012 20:37:06.319898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" event={"ID":"38982daf-184a-4b70-b9d5-f37c23f908f2","Type":"ContainerDied","Data":"6d939e12986918b1c5cb31f6fabb53f2245bfb852dcbc1c931435ce0715daa4a"} Oct 12 20:37:06 crc kubenswrapper[4773]: I1012 20:37:06.319928 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" event={"ID":"38982daf-184a-4b70-b9d5-f37c23f908f2","Type":"ContainerStarted","Data":"74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5"} Oct 12 20:37:07 crc kubenswrapper[4773]: I1012 20:37:07.328852 4773 generic.go:334] "Generic (PLEG): container finished" podID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerID="1bbc4b242045c9087fcc80cfc0bca996593e32d2d26509235edecb5f74225f80" exitCode=0 Oct 12 20:37:07 crc kubenswrapper[4773]: I1012 20:37:07.328905 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" event={"ID":"38982daf-184a-4b70-b9d5-f37c23f908f2","Type":"ContainerDied","Data":"1bbc4b242045c9087fcc80cfc0bca996593e32d2d26509235edecb5f74225f80"} Oct 12 20:37:08 crc kubenswrapper[4773]: I1012 20:37:08.337848 4773 generic.go:334] "Generic (PLEG): container finished" podID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerID="aa2129338689f8030748a00e2a8167b8cd55f1b8b3f7b5c69c86fc09bbc9613c" exitCode=0 Oct 12 20:37:08 crc kubenswrapper[4773]: I1012 20:37:08.337913 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" event={"ID":"38982daf-184a-4b70-b9d5-f37c23f908f2","Type":"ContainerDied","Data":"aa2129338689f8030748a00e2a8167b8cd55f1b8b3f7b5c69c86fc09bbc9613c"} Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.686303 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr"
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.813899 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util\") pod \"38982daf-184a-4b70-b9d5-f37c23f908f2\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") "
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.814065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle\") pod \"38982daf-184a-4b70-b9d5-f37c23f908f2\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") "
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.814104 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqnqw\" (UniqueName: \"kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw\") pod \"38982daf-184a-4b70-b9d5-f37c23f908f2\" (UID: \"38982daf-184a-4b70-b9d5-f37c23f908f2\") "
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.815350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle" (OuterVolumeSpecName: "bundle") pod "38982daf-184a-4b70-b9d5-f37c23f908f2" (UID: "38982daf-184a-4b70-b9d5-f37c23f908f2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.822907 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw" (OuterVolumeSpecName: "kube-api-access-qqnqw") pod "38982daf-184a-4b70-b9d5-f37c23f908f2" (UID: "38982daf-184a-4b70-b9d5-f37c23f908f2"). InnerVolumeSpecName "kube-api-access-qqnqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.827417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util" (OuterVolumeSpecName: "util") pod "38982daf-184a-4b70-b9d5-f37c23f908f2" (UID: "38982daf-184a-4b70-b9d5-f37c23f908f2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.915768 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.915812 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqnqw\" (UniqueName: \"kubernetes.io/projected/38982daf-184a-4b70-b9d5-f37c23f908f2-kube-api-access-qqnqw\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:09 crc kubenswrapper[4773]: I1012 20:37:09.915826 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38982daf-184a-4b70-b9d5-f37c23f908f2-util\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:10 crc kubenswrapper[4773]: I1012 20:37:10.358012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr" event={"ID":"38982daf-184a-4b70-b9d5-f37c23f908f2","Type":"ContainerDied","Data":"74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5"}
Oct 12 20:37:10 crc kubenswrapper[4773]: I1012 20:37:10.358048 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c290eedc01c2e34c7a2f5b2f93c1e5da4f42bf5c7bbaf66e8d9d530f244ba5"
Oct 12 20:37:10 crc kubenswrapper[4773]: I1012 20:37:10.358065 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.747835 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"]
Oct 12 20:37:12 crc kubenswrapper[4773]: E1012 20:37:12.749221 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="pull"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.749295 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="pull"
Oct 12 20:37:12 crc kubenswrapper[4773]: E1012 20:37:12.749351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="util"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.749409 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="util"
Oct 12 20:37:12 crc kubenswrapper[4773]: E1012 20:37:12.749468 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="extract"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.749519 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="extract"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.749672 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="38982daf-184a-4b70-b9d5-f37c23f908f2" containerName="extract"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.750319 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.753626 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-8gq9r"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.787584 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"]
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.849666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6g94\" (UniqueName: \"kubernetes.io/projected/96a44ad1-ead6-4c4d-be23-622d643a0bf0-kube-api-access-f6g94\") pod \"openstack-operator-controller-operator-688d597459-qgbcw\" (UID: \"96a44ad1-ead6-4c4d-be23-622d643a0bf0\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.950380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6g94\" (UniqueName: \"kubernetes.io/projected/96a44ad1-ead6-4c4d-be23-622d643a0bf0-kube-api-access-f6g94\") pod \"openstack-operator-controller-operator-688d597459-qgbcw\" (UID: \"96a44ad1-ead6-4c4d-be23-622d643a0bf0\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:12 crc kubenswrapper[4773]: I1012 20:37:12.972386 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6g94\" (UniqueName: \"kubernetes.io/projected/96a44ad1-ead6-4c4d-be23-622d643a0bf0-kube-api-access-f6g94\") pod \"openstack-operator-controller-operator-688d597459-qgbcw\" (UID: \"96a44ad1-ead6-4c4d-be23-622d643a0bf0\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:13 crc kubenswrapper[4773]: I1012 20:37:13.062343 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:13 crc kubenswrapper[4773]: I1012 20:37:13.554369 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"]
Oct 12 20:37:13 crc kubenswrapper[4773]: W1012 20:37:13.557701 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a44ad1_ead6_4c4d_be23_622d643a0bf0.slice/crio-caa9fd9a771a2ee68d3480625047b94cf810579128eae81b5b183ae50a6ca5e1 WatchSource:0}: Error finding container caa9fd9a771a2ee68d3480625047b94cf810579128eae81b5b183ae50a6ca5e1: Status 404 returned error can't find the container with id caa9fd9a771a2ee68d3480625047b94cf810579128eae81b5b183ae50a6ca5e1
Oct 12 20:37:14 crc kubenswrapper[4773]: I1012 20:37:14.387055 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw" event={"ID":"96a44ad1-ead6-4c4d-be23-622d643a0bf0","Type":"ContainerStarted","Data":"caa9fd9a771a2ee68d3480625047b94cf810579128eae81b5b183ae50a6ca5e1"}
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.414981 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw" event={"ID":"96a44ad1-ead6-4c4d-be23-622d643a0bf0","Type":"ContainerStarted","Data":"643258fe7bac9e9f5a60e66b99488a6fdcdcd5dd63f80d45000cb890869524f9"}
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.812418 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.813563 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.819584 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.943598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndzg\" (UniqueName: \"kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.943651 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:18 crc kubenswrapper[4773]: I1012 20:37:18.943677 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.048364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.048408 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.048880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.048976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.049003 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndzg\" (UniqueName: \"kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.084986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndzg\" (UniqueName: \"kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg\") pod \"community-operators-8nz54\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") " pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.136012 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:19 crc kubenswrapper[4773]: I1012 20:37:19.992829 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:20 crc kubenswrapper[4773]: W1012 20:37:20.493535 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09e01129_eb09_4a3d_ade6_1df195f4b9a8.slice/crio-9e025186ca3fe429b9fa643f7b5397c39bc3f17b342f14c194202f18a49874e4 WatchSource:0}: Error finding container 9e025186ca3fe429b9fa643f7b5397c39bc3f17b342f14c194202f18a49874e4: Status 404 returned error can't find the container with id 9e025186ca3fe429b9fa643f7b5397c39bc3f17b342f14c194202f18a49874e4
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.436110 4773 generic.go:334] "Generic (PLEG): container finished" podID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerID="a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013" exitCode=0
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.436211 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerDied","Data":"a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013"}
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.436281 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerStarted","Data":"9e025186ca3fe429b9fa643f7b5397c39bc3f17b342f14c194202f18a49874e4"}
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.437811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw" event={"ID":"96a44ad1-ead6-4c4d-be23-622d643a0bf0","Type":"ContainerStarted","Data":"1f4cfe3315d4ee6d6e1b89d4ee5349751c5284014767190731f1dbf9f19a0500"}
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.438022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:21 crc kubenswrapper[4773]: I1012 20:37:21.491154 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw" podStartSLOduration=2.230025804 podStartE2EDuration="9.491133203s" podCreationTimestamp="2025-10-12 20:37:12 +0000 UTC" firstStartedPulling="2025-10-12 20:37:13.559849625 +0000 UTC m=+781.796148185" lastFinishedPulling="2025-10-12 20:37:20.820957024 +0000 UTC m=+789.057255584" observedRunningTime="2025-10-12 20:37:21.480910349 +0000 UTC m=+789.717208929" watchObservedRunningTime="2025-10-12 20:37:21.491133203 +0000 UTC m=+789.727431763"
Oct 12 20:37:22 crc kubenswrapper[4773]: I1012 20:37:22.471184 4773 generic.go:334] "Generic (PLEG): container finished" podID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerID="9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1" exitCode=0
Oct 12 20:37:22 crc kubenswrapper[4773]: I1012 20:37:22.471478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerDied","Data":"9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1"}
Oct 12 20:37:22 crc kubenswrapper[4773]: I1012 20:37:22.475349 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-688d597459-qgbcw"
Oct 12 20:37:23 crc kubenswrapper[4773]: I1012 20:37:23.481674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerStarted","Data":"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"}
Oct 12 20:37:23 crc kubenswrapper[4773]: I1012 20:37:23.501904 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nz54" podStartSLOduration=3.953881543 podStartE2EDuration="5.501882086s" podCreationTimestamp="2025-10-12 20:37:18 +0000 UTC" firstStartedPulling="2025-10-12 20:37:21.438865249 +0000 UTC m=+789.675163809" lastFinishedPulling="2025-10-12 20:37:22.986865792 +0000 UTC m=+791.223164352" observedRunningTime="2025-10-12 20:37:23.500697683 +0000 UTC m=+791.736996243" watchObservedRunningTime="2025-10-12 20:37:23.501882086 +0000 UTC m=+791.738180656"
Oct 12 20:37:29 crc kubenswrapper[4773]: I1012 20:37:29.136435 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:29 crc kubenswrapper[4773]: I1012 20:37:29.137955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:29 crc kubenswrapper[4773]: I1012 20:37:29.242310 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:29 crc kubenswrapper[4773]: I1012 20:37:29.588008 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:30 crc kubenswrapper[4773]: I1012 20:37:30.196913 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:31 crc kubenswrapper[4773]: I1012 20:37:31.533586 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8nz54" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="registry-server" containerID="cri-o://3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb" gracePeriod=2
Oct 12 20:37:31 crc kubenswrapper[4773]: I1012 20:37:31.961235 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.010442 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities\") pod \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") "
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.010540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ndzg\" (UniqueName: \"kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg\") pod \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") "
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.010573 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content\") pod \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\" (UID: \"09e01129-eb09-4a3d-ade6-1df195f4b9a8\") "
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.011538 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities" (OuterVolumeSpecName: "utilities") pod "09e01129-eb09-4a3d-ade6-1df195f4b9a8" (UID: "09e01129-eb09-4a3d-ade6-1df195f4b9a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.016395 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg" (OuterVolumeSpecName: "kube-api-access-2ndzg") pod "09e01129-eb09-4a3d-ade6-1df195f4b9a8" (UID: "09e01129-eb09-4a3d-ade6-1df195f4b9a8"). InnerVolumeSpecName "kube-api-access-2ndzg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.065993 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09e01129-eb09-4a3d-ade6-1df195f4b9a8" (UID: "09e01129-eb09-4a3d-ade6-1df195f4b9a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.112037 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.112267 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ndzg\" (UniqueName: \"kubernetes.io/projected/09e01129-eb09-4a3d-ade6-1df195f4b9a8-kube-api-access-2ndzg\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.112361 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e01129-eb09-4a3d-ade6-1df195f4b9a8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.540526 4773 generic.go:334] "Generic (PLEG): container finished" podID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerID="3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb" exitCode=0
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.540698 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerDied","Data":"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"}
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.540913 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nz54" event={"ID":"09e01129-eb09-4a3d-ade6-1df195f4b9a8","Type":"ContainerDied","Data":"9e025186ca3fe429b9fa643f7b5397c39bc3f17b342f14c194202f18a49874e4"}
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.540779 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nz54"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.541646 4773 scope.go:117] "RemoveContainer" containerID="3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.563030 4773 scope.go:117] "RemoveContainer" containerID="9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.584780 4773 scope.go:117] "RemoveContainer" containerID="a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.601748 4773 scope.go:117] "RemoveContainer" containerID="3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602122 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:32 crc kubenswrapper[4773]: E1012 20:37:32.602126 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb\": container with ID starting with 3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb not found: ID does not exist" containerID="3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602217 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb"} err="failed to get container status \"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb\": rpc error: code = NotFound desc = could not find container \"3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb\": container with ID starting with 3408947a6ec2dd5b3bf6d3596718431628f0bcda0f7362d35e7ef32019267dfb not found: ID does not exist"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602245 4773 scope.go:117] "RemoveContainer" containerID="9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1"
Oct 12 20:37:32 crc kubenswrapper[4773]: E1012 20:37:32.602543 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1\": container with ID starting with 9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1 not found: ID does not exist" containerID="9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602571 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1"} err="failed to get container status \"9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1\": rpc error: code = NotFound desc = could not find container \"9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1\": container with ID starting with 9a0636456cc0cf5af433a146ea6d020858068568ce228b4f0dc6eb7ddc8bb7b1 not found: ID does not exist"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602593 4773 scope.go:117] "RemoveContainer" containerID="a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013"
Oct 12 20:37:32 crc kubenswrapper[4773]: E1012 20:37:32.602793 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013\": container with ID starting with a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013 not found: ID does not exist" containerID="a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.602810 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013"} err="failed to get container status \"a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013\": rpc error: code = NotFound desc = could not find container \"a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013\": container with ID starting with a58535459eba2bc4fe585a2d668902c2f51f9b72d5d68e2704a86e239edcd013 not found: ID does not exist"
Oct 12 20:37:32 crc kubenswrapper[4773]: I1012 20:37:32.627913 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8nz54"]
Oct 12 20:37:34 crc kubenswrapper[4773]: I1012 20:37:34.492028 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" path="/var/lib/kubelet/pods/09e01129-eb09-4a3d-ade6-1df195f4b9a8/volumes"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.193256 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"]
Oct 12 20:37:39 crc kubenswrapper[4773]: E1012 20:37:39.193885 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="registry-server"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.193897 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="registry-server"
Oct 12 20:37:39 crc kubenswrapper[4773]: E1012 20:37:39.193913 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="extract-content"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.193920 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="extract-content"
Oct 12 20:37:39 crc kubenswrapper[4773]: E1012 20:37:39.193934 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="extract-utilities"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.193940 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="extract-utilities"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.194047 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e01129-eb09-4a3d-ade6-1df195f4b9a8" containerName="registry-server"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.194611 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.203086 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vsv98"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.203467 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.204367 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.212291 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.212929 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nvm79"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.239838 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.241127 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.244133 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-66r5s"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.263462 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.269741 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.283230 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.284302 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.288186 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tsnvl"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.305649 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjvs\" (UniqueName: \"kubernetes.io/projected/e3a81848-dc85-44b3-addf-35cb34c1e85a-kube-api-access-6xjvs\") pod \"cinder-operator-controller-manager-7b7fb68549-4wwwj\" (UID: \"e3a81848-dc85-44b3-addf-35cb34c1e85a\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.305711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b2gx\" (UniqueName: \"kubernetes.io/projected/69dd4207-8b02-4a43-bc3a-9c939881422f-kube-api-access-6b2gx\") pod \"designate-operator-controller-manager-85d5d9dd78-sfgw7\" (UID: \"69dd4207-8b02-4a43-bc3a-9c939881422f\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.305757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgfnf\" (UniqueName: \"kubernetes.io/projected/ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a-kube-api-access-tgfnf\") pod \"barbican-operator-controller-manager-658bdf4b74-rqvz2\" (UID: \"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.306756 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.317047 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.317927 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.325745 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xs9q5"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.380323 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.385736 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.386842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.393289 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nsrsd"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.412782 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.413482 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjvs\" (UniqueName: \"kubernetes.io/projected/e3a81848-dc85-44b3-addf-35cb34c1e85a-kube-api-access-6xjvs\") pod \"cinder-operator-controller-manager-7b7fb68549-4wwwj\" (UID: \"e3a81848-dc85-44b3-addf-35cb34c1e85a\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.413537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b2gx\" (UniqueName: \"kubernetes.io/projected/69dd4207-8b02-4a43-bc3a-9c939881422f-kube-api-access-6b2gx\") pod \"designate-operator-controller-manager-85d5d9dd78-sfgw7\" (UID: \"69dd4207-8b02-4a43-bc3a-9c939881422f\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.413579 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgfnf\" (UniqueName: \"kubernetes.io/projected/ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a-kube-api-access-tgfnf\") pod \"barbican-operator-controller-manager-658bdf4b74-rqvz2\" (UID: \"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.413618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnd8c\" (UniqueName: \"kubernetes.io/projected/e963f42c-7955-4378-927e-1ab264a6116e-kube-api-access-nnd8c\") pod \"heat-operator-controller-manager-858f76bbdd-xnqzt\" (UID: \"e963f42c-7955-4378-927e-1ab264a6116e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.413644 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrh24\" (UniqueName: \"kubernetes.io/projected/3bff7ce4-adb2-494b-8644-f8e7568efa62-kube-api-access-jrh24\") pod \"glance-operator-controller-manager-84b9b84486-llrqr\" (UID: \"3bff7ce4-adb2-494b-8644-f8e7568efa62\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.418181 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.419391 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx"
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.432222 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.432284 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc"]
Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.433271 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.435694 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rwd2b" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.435905 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.445000 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lgvvn" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.457105 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgfnf\" (UniqueName: \"kubernetes.io/projected/ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a-kube-api-access-tgfnf\") pod \"barbican-operator-controller-manager-658bdf4b74-rqvz2\" (UID: \"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.463627 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b2gx\" (UniqueName: \"kubernetes.io/projected/69dd4207-8b02-4a43-bc3a-9c939881422f-kube-api-access-6b2gx\") pod \"designate-operator-controller-manager-85d5d9dd78-sfgw7\" (UID: \"69dd4207-8b02-4a43-bc3a-9c939881422f\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.473567 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.474773 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.476736 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gjgnm" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.487492 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjvs\" (UniqueName: \"kubernetes.io/projected/e3a81848-dc85-44b3-addf-35cb34c1e85a-kube-api-access-6xjvs\") pod \"cinder-operator-controller-manager-7b7fb68549-4wwwj\" (UID: \"e3a81848-dc85-44b3-addf-35cb34c1e85a\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.494667 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.513274 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.515887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtzs\" (UniqueName: \"kubernetes.io/projected/9392e042-5a5f-47d2-9232-3fa47cce88f3-kube-api-access-pbtzs\") pod \"horizon-operator-controller-manager-7ffbcb7588-shp22\" (UID: \"9392e042-5a5f-47d2-9232-3fa47cce88f3\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.515929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8db\" (UniqueName: \"kubernetes.io/projected/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-kube-api-access-lw8db\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: 
\"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.515970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.516014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnd8c\" (UniqueName: \"kubernetes.io/projected/e963f42c-7955-4378-927e-1ab264a6116e-kube-api-access-nnd8c\") pod \"heat-operator-controller-manager-858f76bbdd-xnqzt\" (UID: \"e963f42c-7955-4378-927e-1ab264a6116e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.516043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrh24\" (UniqueName: \"kubernetes.io/projected/3bff7ce4-adb2-494b-8644-f8e7568efa62-kube-api-access-jrh24\") pod \"glance-operator-controller-manager-84b9b84486-llrqr\" (UID: \"3bff7ce4-adb2-494b-8644-f8e7568efa62\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.516114 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcxm\" (UniqueName: \"kubernetes.io/projected/83700a3c-4ccd-4ac6-8c0a-c530623ffdfe-kube-api-access-jqcxm\") pod \"ironic-operator-controller-manager-9c5c78d49-fqqwc\" (UID: \"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 
20:37:39.516141 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4w5x\" (UniqueName: \"kubernetes.io/projected/5321f2fd-a14c-4a48-be68-bdbefe80aa8d-kube-api-access-m4w5x\") pod \"keystone-operator-controller-manager-55b6b7c7b8-sc6z2\" (UID: \"5321f2fd-a14c-4a48-be68-bdbefe80aa8d\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.521343 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.529656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.548498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrh24\" (UniqueName: \"kubernetes.io/projected/3bff7ce4-adb2-494b-8644-f8e7568efa62-kube-api-access-jrh24\") pod \"glance-operator-controller-manager-84b9b84486-llrqr\" (UID: \"3bff7ce4-adb2-494b-8644-f8e7568efa62\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.548780 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.563137 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.563879 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.566392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnd8c\" (UniqueName: \"kubernetes.io/projected/e963f42c-7955-4378-927e-1ab264a6116e-kube-api-access-nnd8c\") pod \"heat-operator-controller-manager-858f76bbdd-xnqzt\" (UID: \"e963f42c-7955-4378-927e-1ab264a6116e\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.567608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-26x5z" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.600423 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.601300 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.601388 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.613036 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.613667 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qjkc2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.621973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6z9\" (UniqueName: \"kubernetes.io/projected/2e680c12-2026-4296-8ffa-d0185c12d2c1-kube-api-access-6v6z9\") pod \"manila-operator-controller-manager-5f67fbc655-jh557\" (UID: \"2e680c12-2026-4296-8ffa-d0185c12d2c1\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.622035 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcxm\" (UniqueName: \"kubernetes.io/projected/83700a3c-4ccd-4ac6-8c0a-c530623ffdfe-kube-api-access-jqcxm\") pod \"ironic-operator-controller-manager-9c5c78d49-fqqwc\" (UID: \"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.622062 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4w5x\" (UniqueName: \"kubernetes.io/projected/5321f2fd-a14c-4a48-be68-bdbefe80aa8d-kube-api-access-m4w5x\") pod \"keystone-operator-controller-manager-55b6b7c7b8-sc6z2\" (UID: \"5321f2fd-a14c-4a48-be68-bdbefe80aa8d\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.622106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtzs\" (UniqueName: \"kubernetes.io/projected/9392e042-5a5f-47d2-9232-3fa47cce88f3-kube-api-access-pbtzs\") pod 
\"horizon-operator-controller-manager-7ffbcb7588-shp22\" (UID: \"9392e042-5a5f-47d2-9232-3fa47cce88f3\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.622125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8db\" (UniqueName: \"kubernetes.io/projected/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-kube-api-access-lw8db\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.622165 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:39 crc kubenswrapper[4773]: E1012 20:37:39.622273 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 12 20:37:39 crc kubenswrapper[4773]: E1012 20:37:39.622316 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert podName:6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17 nodeName:}" failed. No retries permitted until 2025-10-12 20:37:40.122301036 +0000 UTC m=+808.358599596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert") pod "infra-operator-controller-manager-656bcbd775-8j4jx" (UID: "6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17") : secret "infra-operator-webhook-server-cert" not found Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.645022 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.672337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcxm\" (UniqueName: \"kubernetes.io/projected/83700a3c-4ccd-4ac6-8c0a-c530623ffdfe-kube-api-access-jqcxm\") pod \"ironic-operator-controller-manager-9c5c78d49-fqqwc\" (UID: \"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.681134 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.682327 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.690289 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtzs\" (UniqueName: \"kubernetes.io/projected/9392e042-5a5f-47d2-9232-3fa47cce88f3-kube-api-access-pbtzs\") pod \"horizon-operator-controller-manager-7ffbcb7588-shp22\" (UID: \"9392e042-5a5f-47d2-9232-3fa47cce88f3\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.703810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4w5x\" (UniqueName: \"kubernetes.io/projected/5321f2fd-a14c-4a48-be68-bdbefe80aa8d-kube-api-access-m4w5x\") pod \"keystone-operator-controller-manager-55b6b7c7b8-sc6z2\" (UID: \"5321f2fd-a14c-4a48-be68-bdbefe80aa8d\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.706087 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jk4p6" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.706441 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.726338 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8db\" (UniqueName: \"kubernetes.io/projected/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-kube-api-access-lw8db\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.730990 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wz9\" (UniqueName: \"kubernetes.io/projected/843b5d05-f35d-4632-8781-4c60ed803cb6-kube-api-access-72wz9\") pod \"mariadb-operator-controller-manager-f9fb45f8f-dtppq\" (UID: \"843b5d05-f35d-4632-8781-4c60ed803cb6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.731030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6z9\" (UniqueName: \"kubernetes.io/projected/2e680c12-2026-4296-8ffa-d0185c12d2c1-kube-api-access-6v6z9\") pod \"manila-operator-controller-manager-5f67fbc655-jh557\" (UID: \"2e680c12-2026-4296-8ffa-d0185c12d2c1\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.731068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgqcm\" (UniqueName: \"kubernetes.io/projected/f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b-kube-api-access-tgqcm\") pod \"nova-operator-controller-manager-5df598886f-pbmbc\" (UID: \"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:37:39 crc 
kubenswrapper[4773]: I1012 20:37:39.745032 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.745087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.781268 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht"] Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.782351 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.828594 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.893291 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wz9\" (UniqueName: \"kubernetes.io/projected/843b5d05-f35d-4632-8781-4c60ed803cb6-kube-api-access-72wz9\") pod \"mariadb-operator-controller-manager-f9fb45f8f-dtppq\" (UID: \"843b5d05-f35d-4632-8781-4c60ed803cb6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.893512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgqcm\" (UniqueName: \"kubernetes.io/projected/f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b-kube-api-access-tgqcm\") pod \"nova-operator-controller-manager-5df598886f-pbmbc\" (UID: \"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.901692 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xwl4b" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.944510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6z9\" (UniqueName: \"kubernetes.io/projected/2e680c12-2026-4296-8ffa-d0185c12d2c1-kube-api-access-6v6z9\") pod \"manila-operator-controller-manager-5f67fbc655-jh557\" (UID: \"2e680c12-2026-4296-8ffa-d0185c12d2c1\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.983632 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.984929 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:37:39 crc kubenswrapper[4773]: I1012 20:37:39.998430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69jn\" (UniqueName: \"kubernetes.io/projected/7dc3b970-233d-4af3-a341-8297af5433bc-kube-api-access-z69jn\") pod \"neutron-operator-controller-manager-79d585cb66-b6kht\" (UID: \"7dc3b970-233d-4af3-a341-8297af5433bc\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.002430 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.003053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgqcm\" (UniqueName: \"kubernetes.io/projected/f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b-kube-api-access-tgqcm\") pod 
\"nova-operator-controller-manager-5df598886f-pbmbc\" (UID: \"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.003205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wz9\" (UniqueName: \"kubernetes.io/projected/843b5d05-f35d-4632-8781-4c60ed803cb6-kube-api-access-72wz9\") pod \"mariadb-operator-controller-manager-f9fb45f8f-dtppq\" (UID: \"843b5d05-f35d-4632-8781-4c60ed803cb6\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.027805 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.028982 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.038933 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wkqcp" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.047976 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.070259 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.071687 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.072800 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.079880 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.080200 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4cvc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.083062 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.084177 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.094998 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2ncdn" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.095079 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.096190 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.101928 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k68r4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.106926 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69jn\" (UniqueName: \"kubernetes.io/projected/7dc3b970-233d-4af3-a341-8297af5433bc-kube-api-access-z69jn\") pod \"neutron-operator-controller-manager-79d585cb66-b6kht\" (UID: \"7dc3b970-233d-4af3-a341-8297af5433bc\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.107085 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhh5\" (UniqueName: \"kubernetes.io/projected/095b027c-fb46-4d19-bbcf-84871f8c90f7-kube-api-access-pzhh5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-nvvg8\" (UID: \"095b027c-fb46-4d19-bbcf-84871f8c90f7\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.114914 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.124951 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.149593 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69jn\" (UniqueName: \"kubernetes.io/projected/7dc3b970-233d-4af3-a341-8297af5433bc-kube-api-access-z69jn\") pod \"neutron-operator-controller-manager-79d585cb66-b6kht\" (UID: 
\"7dc3b970-233d-4af3-a341-8297af5433bc\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.176136 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.182815 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.183438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.183858 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.190163 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-w5s2q" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210849 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpkk\" (UniqueName: \"kubernetes.io/projected/2b083bd3-8fe4-44c8-8d3e-f736260b8210-kube-api-access-nwpkk\") pod \"placement-operator-controller-manager-68b6c87b68-thj2w\" (UID: \"2b083bd3-8fe4-44c8-8d3e-f736260b8210\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhh5\" (UniqueName: \"kubernetes.io/projected/095b027c-fb46-4d19-bbcf-84871f8c90f7-kube-api-access-pzhh5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-nvvg8\" (UID: \"095b027c-fb46-4d19-bbcf-84871f8c90f7\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vmv\" (UniqueName: \"kubernetes.io/projected/b9e5880b-293d-4311-8928-f93649649c93-kube-api-access-n9vmv\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.210929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jgdn\" (UniqueName: \"kubernetes.io/projected/34c81f6e-1829-4f0f-a0aa-951b4d4f41c4-kube-api-access-9jgdn\") pod \"ovn-operator-controller-manager-79df5fb58c-rczp5\" (UID: \"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4\") " 
pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.213935 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.214920 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.215351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17-cert\") pod \"infra-operator-controller-manager-656bcbd775-8j4jx\" (UID: \"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.225530 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vjh2x" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.249379 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.251321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhh5\" (UniqueName: \"kubernetes.io/projected/095b027c-fb46-4d19-bbcf-84871f8c90f7-kube-api-access-pzhh5\") pod \"octavia-operator-controller-manager-69fdcfc5f5-nvvg8\" (UID: \"095b027c-fb46-4d19-bbcf-84871f8c90f7\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.288056 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4"] Oct 12 20:37:40 crc 
kubenswrapper[4773]: I1012 20:37:40.303874 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.305191 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwpkk\" (UniqueName: \"kubernetes.io/projected/2b083bd3-8fe4-44c8-8d3e-f736260b8210-kube-api-access-nwpkk\") pod \"placement-operator-controller-manager-68b6c87b68-thj2w\" (UID: \"2b083bd3-8fe4-44c8-8d3e-f736260b8210\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9vmv\" (UniqueName: \"kubernetes.io/projected/b9e5880b-293d-4311-8928-f93649649c93-kube-api-access-n9vmv\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lvr\" (UniqueName: \"kubernetes.io/projected/293153be-33db-41ba-a589-55a17026c756-kube-api-access-v7lvr\") pod \"telemetry-operator-controller-manager-67cfc6749b-bcdc4\" (UID: \"293153be-33db-41ba-a589-55a17026c756\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312533 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jgdn\" (UniqueName: 
\"kubernetes.io/projected/34c81f6e-1829-4f0f-a0aa-951b4d4f41c4-kube-api-access-9jgdn\") pod \"ovn-operator-controller-manager-79df5fb58c-rczp5\" (UID: \"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312586 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgwwm\" (UniqueName: \"kubernetes.io/projected/b2ec8f8f-d841-4683-86ed-54ec360d9ec1-kube-api-access-cgwwm\") pod \"swift-operator-controller-manager-db6d7f97b-vx9cr\" (UID: \"b2ec8f8f-d841-4683-86ed-54ec360d9ec1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.312807 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:37:40 crc kubenswrapper[4773]: E1012 20:37:40.314080 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 20:37:40 crc kubenswrapper[4773]: E1012 20:37:40.314121 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert podName:b9e5880b-293d-4311-8928-f93649649c93 nodeName:}" failed. 
No retries permitted until 2025-10-12 20:37:40.814107336 +0000 UTC m=+809.050405896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert") pod "openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" (UID: "b9e5880b-293d-4311-8928-f93649649c93") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.334097 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xsvcs" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.356083 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.379148 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.380194 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.385334 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwpkk\" (UniqueName: \"kubernetes.io/projected/2b083bd3-8fe4-44c8-8d3e-f736260b8210-kube-api-access-nwpkk\") pod \"placement-operator-controller-manager-68b6c87b68-thj2w\" (UID: \"2b083bd3-8fe4-44c8-8d3e-f736260b8210\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.385941 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vmv\" (UniqueName: \"kubernetes.io/projected/b9e5880b-293d-4311-8928-f93649649c93-kube-api-access-n9vmv\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.392293 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jgdn\" (UniqueName: \"kubernetes.io/projected/34c81f6e-1829-4f0f-a0aa-951b4d4f41c4-kube-api-access-9jgdn\") pod \"ovn-operator-controller-manager-79df5fb58c-rczp5\" (UID: \"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.397299 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lxvbl" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.405888 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.417943 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lvr\" (UniqueName: \"kubernetes.io/projected/293153be-33db-41ba-a589-55a17026c756-kube-api-access-v7lvr\") pod \"telemetry-operator-controller-manager-67cfc6749b-bcdc4\" (UID: \"293153be-33db-41ba-a589-55a17026c756\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.418003 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7z4\" (UniqueName: \"kubernetes.io/projected/e38708a6-e3b7-407d-8fe5-f27cd9a69f76-kube-api-access-6t7z4\") pod \"test-operator-controller-manager-5458f77c4-hz6mw\" (UID: \"e38708a6-e3b7-407d-8fe5-f27cd9a69f76\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.418075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgwwm\" (UniqueName: \"kubernetes.io/projected/b2ec8f8f-d841-4683-86ed-54ec360d9ec1-kube-api-access-cgwwm\") pod \"swift-operator-controller-manager-db6d7f97b-vx9cr\" (UID: \"b2ec8f8f-d841-4683-86ed-54ec360d9ec1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.444899 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.452678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgwwm\" (UniqueName: \"kubernetes.io/projected/b2ec8f8f-d841-4683-86ed-54ec360d9ec1-kube-api-access-cgwwm\") pod 
\"swift-operator-controller-manager-db6d7f97b-vx9cr\" (UID: \"b2ec8f8f-d841-4683-86ed-54ec360d9ec1\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.453007 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.475204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lvr\" (UniqueName: \"kubernetes.io/projected/293153be-33db-41ba-a589-55a17026c756-kube-api-access-v7lvr\") pod \"telemetry-operator-controller-manager-67cfc6749b-bcdc4\" (UID: \"293153be-33db-41ba-a589-55a17026c756\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.520055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjdpn\" (UniqueName: \"kubernetes.io/projected/38cef8bd-b25e-47aa-8f3f-9af1289f72f8-kube-api-access-pjdpn\") pod \"watcher-operator-controller-manager-7f554bff7b-4dsfj\" (UID: \"38cef8bd-b25e-47aa-8f3f-9af1289f72f8\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.520103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7z4\" (UniqueName: \"kubernetes.io/projected/e38708a6-e3b7-407d-8fe5-f27cd9a69f76-kube-api-access-6t7z4\") pod \"test-operator-controller-manager-5458f77c4-hz6mw\" (UID: \"e38708a6-e3b7-407d-8fe5-f27cd9a69f76\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.530185 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm"] Oct 12 
20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.531253 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.532222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.547410 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.547579 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7rfq7" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.550664 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.598414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7z4\" (UniqueName: \"kubernetes.io/projected/e38708a6-e3b7-407d-8fe5-f27cd9a69f76-kube-api-access-6t7z4\") pod \"test-operator-controller-manager-5458f77c4-hz6mw\" (UID: \"e38708a6-e3b7-407d-8fe5-f27cd9a69f76\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.610309 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.616857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.625488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjn8q\" (UniqueName: \"kubernetes.io/projected/094825fc-aaad-4717-9d34-426f1f3fa63f-kube-api-access-qjn8q\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.625550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjdpn\" (UniqueName: \"kubernetes.io/projected/38cef8bd-b25e-47aa-8f3f-9af1289f72f8-kube-api-access-pjdpn\") pod \"watcher-operator-controller-manager-7f554bff7b-4dsfj\" (UID: \"38cef8bd-b25e-47aa-8f3f-9af1289f72f8\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.625627 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.657873 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.664882 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjdpn\" (UniqueName: \"kubernetes.io/projected/38cef8bd-b25e-47aa-8f3f-9af1289f72f8-kube-api-access-pjdpn\") pod \"watcher-operator-controller-manager-7f554bff7b-4dsfj\" (UID: \"38cef8bd-b25e-47aa-8f3f-9af1289f72f8\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.675080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.687769 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.688562 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.710176 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gxxlf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.727663 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.727750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjn8q\" (UniqueName: \"kubernetes.io/projected/094825fc-aaad-4717-9d34-426f1f3fa63f-kube-api-access-qjn8q\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.727962 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:37:40 crc kubenswrapper[4773]: E1012 20:37:40.729525 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 12 20:37:40 crc kubenswrapper[4773]: E1012 20:37:40.729664 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert podName:094825fc-aaad-4717-9d34-426f1f3fa63f nodeName:}" failed. No retries permitted until 2025-10-12 20:37:41.229642154 +0000 UTC m=+809.465940714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert") pod "openstack-operator-controller-manager-5b95c8954b-xgrzm" (UID: "094825fc-aaad-4717-9d34-426f1f3fa63f") : secret "webhook-server-cert" not found Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.753982 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.793867 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjn8q\" (UniqueName: \"kubernetes.io/projected/094825fc-aaad-4717-9d34-426f1f3fa63f-kube-api-access-qjn8q\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.818269 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.829695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.829832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss79r\" (UniqueName: \"kubernetes.io/projected/f7349c73-c56a-4e87-8618-dea521d99b95-kube-api-access-ss79r\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm\" (UID: \"f7349c73-c56a-4e87-8618-dea521d99b95\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.888284 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9e5880b-293d-4311-8928-f93649649c93-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf\" (UID: \"b9e5880b-293d-4311-8928-f93649649c93\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.891571 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.922295 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj"] Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.931015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss79r\" (UniqueName: \"kubernetes.io/projected/f7349c73-c56a-4e87-8618-dea521d99b95-kube-api-access-ss79r\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm\" (UID: \"f7349c73-c56a-4e87-8618-dea521d99b95\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" Oct 12 20:37:40 crc kubenswrapper[4773]: I1012 20:37:40.979651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss79r\" (UniqueName: \"kubernetes.io/projected/f7349c73-c56a-4e87-8618-dea521d99b95-kube-api-access-ss79r\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm\" (UID: \"f7349c73-c56a-4e87-8618-dea521d99b95\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.041119 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.112704 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.113609 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.237326 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:41 crc kubenswrapper[4773]: E1012 20:37:41.237438 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 12 20:37:41 crc kubenswrapper[4773]: E1012 20:37:41.237505 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert podName:094825fc-aaad-4717-9d34-426f1f3fa63f nodeName:}" failed. No retries permitted until 2025-10-12 20:37:42.237478238 +0000 UTC m=+810.473776798 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert") pod "openstack-operator-controller-manager-5b95c8954b-xgrzm" (UID: "094825fc-aaad-4717-9d34-426f1f3fa63f") : secret "webhook-server-cert" not found Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.396551 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.444630 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.529646 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.731501 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" event={"ID":"69dd4207-8b02-4a43-bc3a-9c939881422f","Type":"ContainerStarted","Data":"23e1dbc05e863d4509a46139c2dac702557fff8a2dfb07076f92d6910afc00be"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.751558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" event={"ID":"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe","Type":"ContainerStarted","Data":"bee569d4a4967b219a363484eea94e0a98e4d45915bda1d7ec5132354efad96d"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.756611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" event={"ID":"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a","Type":"ContainerStarted","Data":"754d20c57a1f2770b326cb785f7b3e5d0db73b626f13f8ca0f1f8750a2cea950"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.757580 4773 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.759561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" event={"ID":"9392e042-5a5f-47d2-9232-3fa47cce88f3","Type":"ContainerStarted","Data":"66bad0a533b10cd69398763ed91ba9c63c5fb060c7707293de8f968a3a188ca8"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.763729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" event={"ID":"3bff7ce4-adb2-494b-8644-f8e7568efa62","Type":"ContainerStarted","Data":"bd60931658765536e7cdbdb3ad378624a3bc3a687cb0cbe7bb171af788f0ed96"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.766749 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.771327 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc"] Oct 12 20:37:41 crc kubenswrapper[4773]: W1012 20:37:41.782611 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e680c12_2026_4296_8ffa_d0185c12d2c1.slice/crio-11edf7732be4e6a08625b18c93ba1435fa86cfe68cc62cc6809fe25f57994ace WatchSource:0}: Error finding container 11edf7732be4e6a08625b18c93ba1435fa86cfe68cc62cc6809fe25f57994ace: Status 404 returned error can't find the container with id 11edf7732be4e6a08625b18c93ba1435fa86cfe68cc62cc6809fe25f57994ace Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.783600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" 
event={"ID":"e963f42c-7955-4378-927e-1ab264a6116e","Type":"ContainerStarted","Data":"26487b5b99ef65dc48ed5b9158df6dba705e11e2411638c5624795d321ed637b"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.789321 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" event={"ID":"e3a81848-dc85-44b3-addf-35cb34c1e85a","Type":"ContainerStarted","Data":"b9aa3d10fccc29139d9269c9442a672a21c7ad9953fa6a03a1812e0b3f594338"} Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.915515 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.938706 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq"] Oct 12 20:37:41 crc kubenswrapper[4773]: I1012 20:37:41.943137 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.034042 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx"] Oct 12 20:37:42 crc kubenswrapper[4773]: W1012 20:37:42.034381 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd8fe9c_3d34_4d22_8bc0_1536aa8b2e17.slice/crio-c7d19a51c95bcd79aaa4522775e2860411755b1a470b0b01f551b90672855efe WatchSource:0}: Error finding container c7d19a51c95bcd79aaa4522775e2860411755b1a470b0b01f551b90672855efe: Status 404 returned error can't find the container with id c7d19a51c95bcd79aaa4522775e2860411755b1a470b0b01f551b90672855efe Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.192836 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.198368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.222079 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.233463 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.243902 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.252458 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4"] Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.254254 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9jgdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-79df5fb58c-rczp5_openstack-operators(34c81f6e-1829-4f0f-a0aa-951b4d4f41c4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.259860 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w"] Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.260648 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.281997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/094825fc-aaad-4717-9d34-426f1f3fa63f-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-xgrzm\" (UID: \"094825fc-aaad-4717-9d34-426f1f3fa63f\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.289019 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf"] Oct 12 20:37:42 crc kubenswrapper[4773]: W1012 20:37:42.309575 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7349c73_c56a_4e87_8618_dea521d99b95.slice/crio-0d9e193b318111347a17ed4b2df44ee7efcb95422e9539fdd192ff92e91992d9 WatchSource:0}: Error finding container 0d9e193b318111347a17ed4b2df44ee7efcb95422e9539fdd192ff92e91992d9: Status 404 returned error can't find the container with id 0d9e193b318111347a17ed4b2df44ee7efcb95422e9539fdd192ff92e91992d9 Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.324134 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss79r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm_openstack-operators(f7349c73-c56a-4e87-8618-dea521d99b95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 
20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.325001 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7lvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-67cfc6749b-bcdc4_openstack-operators(293153be-33db-41ba-a589-55a17026c756): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.325268 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" podUID="f7349c73-c56a-4e87-8618-dea521d99b95" Oct 12 20:37:42 crc kubenswrapper[4773]: W1012 20:37:42.357205 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b083bd3_8fe4_44c8_8d3e_f736260b8210.slice/crio-9385b30bca29c189ddb740e53af218af9681d0bf6fc75dca66b0011bd390b7ec WatchSource:0}: Error finding container 9385b30bca29c189ddb740e53af218af9681d0bf6fc75dca66b0011bd390b7ec: Status 404 returned error can't find the container with id 9385b30bca29c189ddb740e53af218af9681d0bf6fc75dca66b0011bd390b7ec Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.362247 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjdpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7f554bff7b-4dsfj_openstack-operators(38cef8bd-b25e-47aa-8f3f-9af1289f72f8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.364454 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nwpkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-68b6c87b68-thj2w_openstack-operators(2b083bd3-8fe4-44c8-8d3e-f736260b8210): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.365266 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.513393 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" podUID="34c81f6e-1829-4f0f-a0aa-951b4d4f41c4" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.802904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" event={"ID":"7dc3b970-233d-4af3-a341-8297af5433bc","Type":"ContainerStarted","Data":"df852182563ef267dfdc2d907e9429013759c1cb1d2f2ad83841f77111ec3a13"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.815028 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" event={"ID":"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b","Type":"ContainerStarted","Data":"5f6ef355c4e07c18cc12f8085ab97dbe0da137ec44373c603153566042f81317"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.820631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" event={"ID":"e38708a6-e3b7-407d-8fe5-f27cd9a69f76","Type":"ContainerStarted","Data":"12e017d7dc50ba182ec3edb4a5187e320a34ea3a9e86a3e163243c9476c4927c"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.826660 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" event={"ID":"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4","Type":"ContainerStarted","Data":"a879546e03c87d723f0bea84dee3c8ad15c3c939e095410d7dfe0b38d9a36af6"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.826695 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" event={"ID":"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4","Type":"ContainerStarted","Data":"75abe4d1da886260c663174ae24dc96b7658aaec8b0009a475cdbfc8948a9d78"} Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.829207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" podUID="34c81f6e-1829-4f0f-a0aa-951b4d4f41c4" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.843130 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" event={"ID":"2e680c12-2026-4296-8ffa-d0185c12d2c1","Type":"ContainerStarted","Data":"11edf7732be4e6a08625b18c93ba1435fa86cfe68cc62cc6809fe25f57994ace"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.854400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" event={"ID":"843b5d05-f35d-4632-8781-4c60ed803cb6","Type":"ContainerStarted","Data":"1ba471c3f309e60c3f258c65c3b328c751f4dab6a84db4219135d2702496b9ef"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.857583 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm"] Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.867375 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" podUID="293153be-33db-41ba-a589-55a17026c756" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.874591 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" event={"ID":"b9e5880b-293d-4311-8928-f93649649c93","Type":"ContainerStarted","Data":"c407f8182c1ab3882b13cb77253f2514aeaf3390c77095f247d8607aa3e7ab4d"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.877556 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" event={"ID":"293153be-33db-41ba-a589-55a17026c756","Type":"ContainerStarted","Data":"6e867fc82d48874c13d0092fe433c4e7ed6e8f42c42735fab6ffc8b391cd68ee"} Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.879803 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" podUID="293153be-33db-41ba-a589-55a17026c756" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.887254 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" event={"ID":"2b083bd3-8fe4-44c8-8d3e-f736260b8210","Type":"ContainerStarted","Data":"9385b30bca29c189ddb740e53af218af9681d0bf6fc75dca66b0011bd390b7ec"} Oct 12 20:37:42 crc kubenswrapper[4773]: E1012 20:37:42.891794 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" podUID="38cef8bd-b25e-47aa-8f3f-9af1289f72f8" Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.893894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" event={"ID":"38cef8bd-b25e-47aa-8f3f-9af1289f72f8","Type":"ContainerStarted","Data":"cb95329be1daf702bb09a70ac700c5fdfa2bc10401ae9b255ee86b12aa1b7da3"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.928009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" event={"ID":"b2ec8f8f-d841-4683-86ed-54ec360d9ec1","Type":"ContainerStarted","Data":"faec42a66af491d84fb01dec15f7b6e14e6b59836938b3f10e2ac023299f01a6"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.933353 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" event={"ID":"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17","Type":"ContainerStarted","Data":"c7d19a51c95bcd79aaa4522775e2860411755b1a470b0b01f551b90672855efe"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.938170 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" event={"ID":"095b027c-fb46-4d19-bbcf-84871f8c90f7","Type":"ContainerStarted","Data":"c96c3fa1d249a74474ebc00fb670bef0d2bfcf8b1e162002b14e1240cbb855a9"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.958479 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" event={"ID":"5321f2fd-a14c-4a48-be68-bdbefe80aa8d","Type":"ContainerStarted","Data":"52629c7d1fa062a615c3932a2a3dad21356770f1c191d450fbefb8bc70bdd351"} Oct 12 20:37:42 crc kubenswrapper[4773]: I1012 20:37:42.959673 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" event={"ID":"f7349c73-c56a-4e87-8618-dea521d99b95","Type":"ContainerStarted","Data":"0d9e193b318111347a17ed4b2df44ee7efcb95422e9539fdd192ff92e91992d9"} Oct 12 20:37:42 crc 
kubenswrapper[4773]: E1012 20:37:42.964056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" podUID="f7349c73-c56a-4e87-8618-dea521d99b95" Oct 12 20:37:43 crc kubenswrapper[4773]: E1012 20:37:43.041425 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" podUID="2b083bd3-8fe4-44c8-8d3e-f736260b8210" Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.009270 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" event={"ID":"293153be-33db-41ba-a589-55a17026c756","Type":"ContainerStarted","Data":"ac3a80317bf89a512fb2b7176ca1f0bd4f526c4d0a86d69388ddbed34dc18c83"} Oct 12 20:37:44 crc kubenswrapper[4773]: E1012 20:37:44.015741 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" podUID="293153be-33db-41ba-a589-55a17026c756" Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.018838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" event={"ID":"2b083bd3-8fe4-44c8-8d3e-f736260b8210","Type":"ContainerStarted","Data":"a6e2d4c607194fdbf69be86d2292136a679944d48af2250d33f991a16a84d34e"} Oct 12 20:37:44 crc 
kubenswrapper[4773]: E1012 20:37:44.024894 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" podUID="2b083bd3-8fe4-44c8-8d3e-f736260b8210" Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.028510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" event={"ID":"38cef8bd-b25e-47aa-8f3f-9af1289f72f8","Type":"ContainerStarted","Data":"28cc6927acc5b3be3f148d2611105cd3936e0519a5a80f4f4f4472e3bbf7a796"} Oct 12 20:37:44 crc kubenswrapper[4773]: E1012 20:37:44.034613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" podUID="38cef8bd-b25e-47aa-8f3f-9af1289f72f8" Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.049316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" event={"ID":"094825fc-aaad-4717-9d34-426f1f3fa63f","Type":"ContainerStarted","Data":"f15ef7d07cbe794c410880a33538f4d132040669bbe66bdfb57e535719ea2e7e"} Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.049362 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" event={"ID":"094825fc-aaad-4717-9d34-426f1f3fa63f","Type":"ContainerStarted","Data":"3e31d6fe36f83b1b5091744815601950090ca12556b1ca0264ea49b4f43f6571"} Oct 12 20:37:44 
crc kubenswrapper[4773]: I1012 20:37:44.049373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" event={"ID":"094825fc-aaad-4717-9d34-426f1f3fa63f","Type":"ContainerStarted","Data":"08bf8c9358424bb6ddd6434983b4bb23e22088f20c591374017e7aa60fac7a18"} Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.049557 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:44 crc kubenswrapper[4773]: E1012 20:37:44.050584 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" podUID="34c81f6e-1829-4f0f-a0aa-951b4d4f41c4" Oct 12 20:37:44 crc kubenswrapper[4773]: E1012 20:37:44.051565 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" podUID="f7349c73-c56a-4e87-8618-dea521d99b95" Oct 12 20:37:44 crc kubenswrapper[4773]: I1012 20:37:44.175220 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" podStartSLOduration=4.175202253 podStartE2EDuration="4.175202253s" podCreationTimestamp="2025-10-12 20:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:37:44.170572174 +0000 UTC 
m=+812.406870734" watchObservedRunningTime="2025-10-12 20:37:44.175202253 +0000 UTC m=+812.411500813" Oct 12 20:37:45 crc kubenswrapper[4773]: E1012 20:37:45.079939 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" podUID="2b083bd3-8fe4-44c8-8d3e-f736260b8210" Oct 12 20:37:45 crc kubenswrapper[4773]: E1012 20:37:45.080589 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" podUID="38cef8bd-b25e-47aa-8f3f-9af1289f72f8" Oct 12 20:37:45 crc kubenswrapper[4773]: E1012 20:37:45.080633 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" podUID="293153be-33db-41ba-a589-55a17026c756" Oct 12 20:37:52 crc kubenswrapper[4773]: I1012 20:37:52.377673 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-xgrzm" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.170047 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.172922 4773 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.178624 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.280618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7mv\" (UniqueName: \"kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.280769 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.280794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.381531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.382244 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.382341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7mv\" (UniqueName: \"kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.382342 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.382568 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.409571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7mv\" (UniqueName: \"kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv\") pod \"redhat-operators-pzqcq\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:53 crc kubenswrapper[4773]: I1012 20:37:53.488939 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:37:54 crc kubenswrapper[4773]: E1012 20:37:54.675706 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8" Oct 12 20:37:54 crc kubenswrapper[4773]: E1012 20:37:54.675979 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6v6z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5f67fbc655-jh557_openstack-operators(2e680c12-2026-4296-8ffa-d0185c12d2c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:37:55 crc kubenswrapper[4773]: E1012 20:37:55.242943 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7" Oct 12 20:37:55 crc kubenswrapper[4773]: E1012 20:37:55.243160 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pzhh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-69fdcfc5f5-nvvg8_openstack-operators(095b027c-fb46-4d19-bbcf-84871f8c90f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:37:56 crc kubenswrapper[4773]: E1012 20:37:56.532561 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351" Oct 12 20:37:56 crc kubenswrapper[4773]: E1012 20:37:56.534841 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:03b4f3db4b373515f7e4095984b97197c05a14f87b2a0a525eb5d7be1d7bda66,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:6722a752fb7cbffbae811f6ad6567120fbd4ebbe8c38a83ec2df02850a3276bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:2115452234aedb505ed4efc6cd9b9a4ce3b9809aa7d0128d8fbeeee84dad1a69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:50597a8eaa6c4383f357574dcab8358b6987297
97b4156d932985a08ab86b7cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:cb4997d62c7b2534233a676cb92e19cf85dda07e2fb9fa642c28aab30489f69a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1ccbf3f6cf24c9ee91bed71467491e22b8cb4b95bce90250f4174fae936b0fa1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:e7dcc3bf23d5e0393ac173e3c43d4ae85f4613a4fd16b3c147dc32ae491d49bf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:2a1a8b582c6e4cc31081bd8b0887acf45e31c1d14596c4e361d27d08fef0debf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:6d28de018f6e1672e775a75735e3bc16b63da41acd8fb5196ee0b06856c07133,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPO
RTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:c5fc9b72fc593bcf3b569c7ed24a256448eb1afab1504e668a3822e978be1306,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:88b99249f15470f359fb554f7f3a56974b743f4655e3f0c982c0260f75a67697,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:e861d66785047d39eb68d9bac23e3f57ac84d9bd95593502d9b3b913b99fd1a4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:b95f09bf3d259f9eacf3b63931977483f5c3c332f49b95ee8a69d8e3fb71d082,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:6fc7801c0d18d41b9f11484b1cdb342de9cebd93072ec2205dbe40945715184f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:d4d824b80cbed683543d9e8c7045ac97e080774f45a5067ccbca26404e067821,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:182ec7
5938d8d3fb7d8f916373368add24062fec90489aa57776a81d0b36ea20,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:9507ba5ab74cbae902e2dc07f89c7b3b5b76d8079e444365fe0eee6000fd7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:17db080dcc4099f8a20aa0f238b6bca5c104672ae46743adeab9d1637725ecaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:fd55cf3d73bfdc518419c9ba0b0cbef275140ae2d3bd0342a7310f81d57c2d78,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:d164a9bd383f50df69fc22e7422f4650cd5076c90ed19278fc0f04e54345a63d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:6beffe7d0bd75f9d1f495aeb7ab2334a2414af2c581d4833363df8441ed01018,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:2308c7b6c3d0aabbadfc9a06d84d67d2243f27fe8eed740ee96b1ce910203f62,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAUL
T,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:9cf0ca292340f1f978603955ef682effbf24316d6e2376b1c89906d84c3f06d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:58f678016d7f6c8fe579abe886fd138ef853642faa6766ca60639feac12d82ac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:46f92909153aaf03a585374b77d103c536509747e3270558d9a533295c46a7c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:7fe367f51638c5c302fd3f8e66a31b09cb3b11519a7f72ef142b6c6fe8b91694,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:4fcbe0d9a3c845708ecc32102ad4abbcbd947d87e5cf91f186de75b5d84ec681,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:58a4e9a4dea86635c
93ce37a2bb3c60ece62b3d656f6ee6a8845347cbb3e90fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:6f2b843bc9f4ceb1ee873972d69e6bae6e1dbd378b486995bc3697d8bcff6339,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:03b4bb79b71d5ca7792d19c4c0ee08a5e5a407ad844c087305c42dd909ee7490,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:773daada6402d9cad089cdc809d6c0335456d057ac1a25441ab5d82add2f70f4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:7323406a63fb3fdbb3eea4da0f7e8ed89c94c9bd0ad5ecd6c18fa4a4c2c550c4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:7ae82068011e2d2e5ddc88c943fd32ff4a11902793e7a1df729811b2e27122a0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:0c762c15d9d98d39cc9dc3d1f9a70f9188fef58d4e2f3b0c69c896cab8da5e48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:febf65561eeef5b36b70d0d65ee83f6451e43ec97bfab4d826e14215da6ff19b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:b8aadfc3d547c5ef1e27fcb573d4760cf8c2f2271eefe1793c35a0d46b640837,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:ecc91fd5079ee6d0c6ae1b11e97da790e33864d0e1930e574f959da2bddfa59a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFA
ULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:2e981e93f99c929a3f04e5e41c8f645d44d390a9aeee3c5193cce7ec2edcbf3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:1e5714637b6e1a24c2858fe6d9bbb3f00bc61d69ad74a657b1c23682bf4cb2b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:35b8dcf27dc3b67f3840fa0e693ff312f74f7e22c634dff206a5c4d0133c716c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:e109e4863e05e803dbfe04917756fd52231c560c65353170a2000be6cc2bb53d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:6df0bebd9318ce11624413249e7e9781311638f276f8877668d3b382fe90e62f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:a51ed62767206067aa501142dbf01f20b3d65325d30faf1b4d6424d5b17dfba5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:592e3cd32d3cc97a69093ad905b449aa374ffbb1b2644b738bb6c1434476d1f6,ValueFrom:nil,},EnvVar{Name:
RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:9596452e283febbe08204d0ef0fd1992af3395d0969f7ac76663ed7c8be5b4d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:d61005a10bef1b37762a8a41e6755c1169241e36cc5f92886bca6f4f6b9c381a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:e6a4335bcbeed3cd3e73ac879f754e314761e4a417a67539ca88e96a79346328,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:97d88fc53421b699fc91983313d7beec4a0f177089e95bdf5ba15c3f521db9a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:5365e5c9c3ad2ede1b6945255b2cc6b009d642c39babdf25e0655282cfa646fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:5b55795d774e0ea160ff8a7fd491ed41cf2d93c7d821694abb3a879eaffcefeb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:26e955c46a6063eafcfeb79430bf3d9268dbe95687c00e63a624b3ec5a846f5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:58939baa18ab09e2b24996c5f3665ae52274b781f661ea06a67c991e9a832d5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cen
tos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:d97b08fd421065c8c33a523973822ac468500cbe853069aa9214393fbda7a908,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:289dea3beea1cd4405895fc42e44372b35e4a941e31c59e102c333471a3ca9b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:9b19894fa67a81bf8ba4159b55b49f38877c670aeb97e2021c341cef2a9294e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:ea164961ad30453ad0301c6b73364e1f1024f689634c88dd98265f9c7048e31d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:6f9f2ea45f0271f6da8eb05a5f74cf5ce6769479346f5c2f407ee6f31a9c7ff3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd
22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:adcdeb8ecd601fb03c3b0901d5b5111af2ca48f7dd443e22224db6daaf08f5d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:2bf32d9b95899d7637dfe19d07cf1ecc9a06593984faff57a3c0dce060012edb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:7a452cd18b64d522e8a1e25bdcea543e9fe5f5b76e1c5e044c2b5334e06a326b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:6a46aa13aa359b8e782a22d67db42db02bbf2bb7e35df4b684ac1daeda38cde3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:f6824854bea6b2acbb00c34639799b4744818d4adbdd40e37dc5088f9ae18d58,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:a66d2fdc21f25c690f02e643d2666dbe7df43a64cd55086ec33d6755e6d809b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:30701a65382430570f6fb35621f64f1003f727b6da745ce84fb1a90436ee2350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:b9a657c51bbcc236e6c906a6df6c42cd2a28bab69e7ab58b0e9ced12295b2d87,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:fd65fb5c9710c46aa1c31e65a51cd5c23ec35cf68c2452d421f919f2aa9b6255,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9vmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf_openstack-operators(b9e5880b-293d-4311-8928-f93649649c93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:37:57 
crc kubenswrapper[4773]: E1012 20:37:57.071115 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492" Oct 12 20:37:57 crc kubenswrapper[4773]: E1012 20:37:57.071307 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lw8db,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-656bcbd775-8j4jx_openstack-operators(6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:37:58 crc kubenswrapper[4773]: E1012 20:37:58.436870 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:ec11cb8711bd1af22db3c84aa854349ee46191add3db45aecfabb1d8410c04d0" Oct 12 20:37:58 crc kubenswrapper[4773]: E1012 20:37:58.437227 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:ec11cb8711bd1af22db3c84aa854349ee46191add3db45aecfabb1d8410c04d0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nnd8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
heat-operator-controller-manager-858f76bbdd-xnqzt_openstack-operators(e963f42c-7955-4378-927e-1ab264a6116e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:37:58 crc kubenswrapper[4773]: E1012 20:37:58.963678 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34" Oct 12 20:37:58 crc kubenswrapper[4773]: E1012 20:37:58.964274 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6b2gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-85d5d9dd78-sfgw7_openstack-operators(69dd4207-8b02-4a43-bc3a-9c939881422f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:38:00 crc kubenswrapper[4773]: E1012 20:38:00.250853 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997" Oct 12 20:38:00 crc kubenswrapper[4773]: E1012 20:38:00.251128 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m4w5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-55b6b7c7b8-sc6z2_openstack-operators(5321f2fd-a14c-4a48-be68-bdbefe80aa8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:38:01 crc kubenswrapper[4773]: E1012 20:38:01.608976 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960" Oct 12 20:38:01 crc kubenswrapper[4773]: E1012 20:38:01.609195 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqcxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-9c5c78d49-fqqwc_openstack-operators(83700a3c-4ccd-4ac6-8c0a-c530623ffdfe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:38:02 crc kubenswrapper[4773]: E1012 20:38:02.139931 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e" Oct 12 20:38:02 crc kubenswrapper[4773]: E1012 20:38:02.140163 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cgwwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-db6d7f97b-vx9cr_openstack-operators(b2ec8f8f-d841-4683-86ed-54ec360d9ec1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.337983 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.339345 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.344218 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.412927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf28c\" (UniqueName: \"kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.413021 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.413041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" 
Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.514123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.514162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.514229 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf28c\" (UniqueName: \"kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.515253 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.515766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: 
I1012 20:38:02.531478 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf28c\" (UniqueName: \"kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c\") pod \"certified-operators-p4k6v\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:02 crc kubenswrapper[4773]: I1012 20:38:02.702030 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:03 crc kubenswrapper[4773]: E1012 20:38:03.512956 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a" Oct 12 20:38:03 crc kubenswrapper[4773]: E1012 20:38:03.513132 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6t7z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5458f77c4-hz6mw_openstack-operators(e38708a6-e3b7-407d-8fe5-f27cd9a69f76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.474998 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.477097 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.495241 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.579292 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4l8\" (UniqueName: \"kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.579342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.579374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.680905 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4l8\" (UniqueName: \"kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.680965 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.680994 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.681580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.682233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.715201 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4l8\" (UniqueName: \"kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8\") pod \"redhat-marketplace-82kvr\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:07 crc kubenswrapper[4773]: I1012 20:38:07.804651 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.199667 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.231910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" event={"ID":"7dc3b970-233d-4af3-a341-8297af5433bc","Type":"ContainerStarted","Data":"9fca2414e320b0b604b48b591696a1b8fb347e89ae099be45efdc3726e9879d0"} Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.250912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" event={"ID":"3bff7ce4-adb2-494b-8644-f8e7568efa62","Type":"ContainerStarted","Data":"7b4b1004b406e63fcf282942e273dd20101d29560745ab7697d2d69e41413cff"} Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.253986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" event={"ID":"e3a81848-dc85-44b3-addf-35cb34c1e85a","Type":"ContainerStarted","Data":"4de1f44d420bc8016b4dbc6e7225460922782cacbec625d1bec2966bd7d78027"} Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.307666 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:08 crc kubenswrapper[4773]: W1012 20:38:08.385391 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e96b5e_1264_4224_917b_dccb3e792b70.slice/crio-dfa01ec59db686d436e6b215bf63fa0bcd6df23a360a84e1f48fc84c9ed30dec WatchSource:0}: Error finding container dfa01ec59db686d436e6b215bf63fa0bcd6df23a360a84e1f48fc84c9ed30dec: Status 404 returned error can't find the container with id 
dfa01ec59db686d436e6b215bf63fa0bcd6df23a360a84e1f48fc84c9ed30dec Oct 12 20:38:08 crc kubenswrapper[4773]: I1012 20:38:08.490482 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:08 crc kubenswrapper[4773]: W1012 20:38:08.561987 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86df0c4_a7dc_4309_9fc1_faac3382346c.slice/crio-0755fa685a06d5d90c746461eec544689337071843e0274d9e49d167169b5394 WatchSource:0}: Error finding container 0755fa685a06d5d90c746461eec544689337071843e0274d9e49d167169b5394: Status 404 returned error can't find the container with id 0755fa685a06d5d90c746461eec544689337071843e0274d9e49d167169b5394 Oct 12 20:38:08 crc kubenswrapper[4773]: E1012 20:38:08.695031 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" podUID="b9e5880b-293d-4311-8928-f93649649c93" Oct 12 20:38:08 crc kubenswrapper[4773]: E1012 20:38:08.714459 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" podUID="e38708a6-e3b7-407d-8fe5-f27cd9a69f76" Oct 12 20:38:08 crc kubenswrapper[4773]: E1012 20:38:08.777162 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" podUID="2e680c12-2026-4296-8ffa-d0185c12d2c1" Oct 12 20:38:08 crc kubenswrapper[4773]: E1012 20:38:08.995701 
4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" podUID="69dd4207-8b02-4a43-bc3a-9c939881422f" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.014558 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" podUID="b2ec8f8f-d841-4683-86ed-54ec360d9ec1" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.084876 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" podUID="5321f2fd-a14c-4a48-be68-bdbefe80aa8d" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.272162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" event={"ID":"69dd4207-8b02-4a43-bc3a-9c939881422f","Type":"ContainerStarted","Data":"d64c8b91092a805fbc98adf680230529585e3d0439d7bfaa6e9949bf1b8de426"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.292557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" event={"ID":"843b5d05-f35d-4632-8781-4c60ed803cb6","Type":"ContainerStarted","Data":"67499bcd684d0229956a70b02c5cadabf6ffcf4883051da81b10b61129ab1e14"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.294939 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" 
event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerStarted","Data":"0dd591e5cca2a81846e0500184a4fc40ae3e74a13d6a78e4bf416175ae6d21cd"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.296311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" event={"ID":"38cef8bd-b25e-47aa-8f3f-9af1289f72f8","Type":"ContainerStarted","Data":"83e090d072d19fa1fbf433d9c3607d4644a4567523a021555912fac55d063c1c"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.296805 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.300697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" event={"ID":"e38708a6-e3b7-407d-8fe5-f27cd9a69f76","Type":"ContainerStarted","Data":"ff9fe87d96c109c8bfdc070862585fdd1ad6fdc3159ae99de0170efc854f196e"} Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.303007 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" podUID="e38708a6-e3b7-407d-8fe5-f27cd9a69f76" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.305486 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" event={"ID":"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a","Type":"ContainerStarted","Data":"8c3ae3374136f9155b48029f737bce4bdeb880cdacd4b9998177444541ca5d29"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.306830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" event={"ID":"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b","Type":"ContainerStarted","Data":"15c543de38381a73174aa5b5406361ace56c67aee989b419830ffb4d64a30994"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.323191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerStarted","Data":"0755fa685a06d5d90c746461eec544689337071843e0274d9e49d167169b5394"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.325580 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerStarted","Data":"dfa01ec59db686d436e6b215bf63fa0bcd6df23a360a84e1f48fc84c9ed30dec"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.330238 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" event={"ID":"293153be-33db-41ba-a589-55a17026c756","Type":"ContainerStarted","Data":"9004bee82696ad7440546970565866df11fece2c9d1f9c46401ef40acf084300"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.330929 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.334278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" event={"ID":"5321f2fd-a14c-4a48-be68-bdbefe80aa8d","Type":"ContainerStarted","Data":"cf2a76060fac4488ecc81fb618d1f5848c9fa7a4f32564d62996403cc3105733"} Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.335945 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" podUID="5321f2fd-a14c-4a48-be68-bdbefe80aa8d" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.338496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" event={"ID":"2b083bd3-8fe4-44c8-8d3e-f736260b8210","Type":"ContainerStarted","Data":"30077fafcd8430fb41ae14afdf2fb13c9c3782710bd30c78505dd2c94f037516"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.339250 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.341388 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" event={"ID":"34c81f6e-1829-4f0f-a0aa-951b4d4f41c4","Type":"ContainerStarted","Data":"2244c0879f8e097b3bc8afafb2a5a343a3298d3bf84cbb9f7e33c95b8babfd88"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.342308 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.343203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" event={"ID":"2e680c12-2026-4296-8ffa-d0185c12d2c1","Type":"ContainerStarted","Data":"a71ae75137a22dcc277e38cd832e283a5955ae50284057453a643057d055ef46"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.344330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" 
event={"ID":"b9e5880b-293d-4311-8928-f93649649c93","Type":"ContainerStarted","Data":"f2ce447cdfecddf9f9249ba922eb69f9f4a8cf37def7b9117746398f4d15134e"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.353873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" event={"ID":"f7349c73-c56a-4e87-8618-dea521d99b95","Type":"ContainerStarted","Data":"e58bb8349c7bf26f13ba7115b296aee4ffa96d559b363d48953bb3df60212e7e"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.377640 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" event={"ID":"9392e042-5a5f-47d2-9232-3fa47cce88f3","Type":"ContainerStarted","Data":"cddca8234b5b5fc65809c10278515ee90bc687301c3a1daa03a1eb646120b366"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.382036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" event={"ID":"b2ec8f8f-d841-4683-86ed-54ec360d9ec1","Type":"ContainerStarted","Data":"5f1bb2fe1c9338dcbbddbca0699599a2826006d1dc1ccee5aa2d638e3c782aaf"} Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.387567 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" podStartSLOduration=4.87160962 podStartE2EDuration="30.387535989s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.324850321 +0000 UTC m=+810.561148881" lastFinishedPulling="2025-10-12 20:38:07.8407767 +0000 UTC m=+836.077075250" observedRunningTime="2025-10-12 20:38:09.383135656 +0000 UTC m=+837.619434216" watchObservedRunningTime="2025-10-12 20:38:09.387535989 +0000 UTC m=+837.623834549" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.387900 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" podUID="b2ec8f8f-d841-4683-86ed-54ec360d9ec1" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.423166 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" podStartSLOduration=4.858953778 podStartE2EDuration="30.423147789s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.362131198 +0000 UTC m=+810.598429758" lastFinishedPulling="2025-10-12 20:38:07.926325209 +0000 UTC m=+836.162623769" observedRunningTime="2025-10-12 20:38:09.415187158 +0000 UTC m=+837.651485718" watchObservedRunningTime="2025-10-12 20:38:09.423147789 +0000 UTC m=+837.659446339" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.465410 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" podStartSLOduration=4.8155202599999996 podStartE2EDuration="30.465391474s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.254132054 +0000 UTC m=+810.490430614" lastFinishedPulling="2025-10-12 20:38:07.904003258 +0000 UTC m=+836.140301828" observedRunningTime="2025-10-12 20:38:09.463615925 +0000 UTC m=+837.699914485" watchObservedRunningTime="2025-10-12 20:38:09.465391474 +0000 UTC m=+837.701690034" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.574704 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm" podStartSLOduration=3.959654649 podStartE2EDuration="29.574685614s" podCreationTimestamp="2025-10-12 20:37:40 +0000 UTC" 
firstStartedPulling="2025-10-12 20:37:42.323973797 +0000 UTC m=+810.560272357" lastFinishedPulling="2025-10-12 20:38:07.939004762 +0000 UTC m=+836.175303322" observedRunningTime="2025-10-12 20:38:09.573574413 +0000 UTC m=+837.809872973" watchObservedRunningTime="2025-10-12 20:38:09.574685614 +0000 UTC m=+837.810984174" Oct 12 20:38:09 crc kubenswrapper[4773]: I1012 20:38:09.597005 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" podStartSLOduration=5.023970418 podStartE2EDuration="30.596990324s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.364387091 +0000 UTC m=+810.600685651" lastFinishedPulling="2025-10-12 20:38:07.937406997 +0000 UTC m=+836.173705557" observedRunningTime="2025-10-12 20:38:09.589593759 +0000 UTC m=+837.825892319" watchObservedRunningTime="2025-10-12 20:38:09.596990324 +0000 UTC m=+837.833288874" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.624809 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" podUID="095b027c-fb46-4d19-bbcf-84871f8c90f7" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.820607 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" podUID="e963f42c-7955-4378-927e-1ab264a6116e" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.873667 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" podUID="83700a3c-4ccd-4ac6-8c0a-c530623ffdfe" Oct 12 20:38:09 crc kubenswrapper[4773]: E1012 20:38:09.909935 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" podUID="6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.414092 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" event={"ID":"e963f42c-7955-4378-927e-1ab264a6116e","Type":"ContainerStarted","Data":"3268168b8f4836734a480f074d6b615c1c0e2b916abcc99536b5fd056a7004c5"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.423909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" event={"ID":"843b5d05-f35d-4632-8781-4c60ed803cb6","Type":"ContainerStarted","Data":"aa8b33048e3e324e87d1edcf016493cf3a0ef303920d4fbc75f3ae0d21938835"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.426739 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" event={"ID":"095b027c-fb46-4d19-bbcf-84871f8c90f7","Type":"ContainerStarted","Data":"dbdf0a058bc96b8222e4d6a7196deb3ccc8202560d2b7db441e3b57183b65d90"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.426937 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.431624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" 
event={"ID":"9392e042-5a5f-47d2-9232-3fa47cce88f3","Type":"ContainerStarted","Data":"b454d7a2011546ee13768b38714861f0eabeceefb03dcc8a25bf90a7d5ec2822"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.432068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.461976 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" event={"ID":"7dc3b970-233d-4af3-a341-8297af5433bc","Type":"ContainerStarted","Data":"00bf232f9f4cf1e5f58364a922ac50b205ce1b4a1221e22bb3b6a06ec3d026fb"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.462243 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.481569 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" event={"ID":"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe","Type":"ContainerStarted","Data":"b24ef830056185a25b4d8f6048f156f0132faadd8f08f63c7ab9d78657d8c391"} Oct 12 20:38:10 crc kubenswrapper[4773]: E1012 20:38:10.484918 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:ee05f2b06405240a8fcdbd430a9e8983b4667f372548334307b68c154e389960\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" podUID="83700a3c-4ccd-4ac6-8c0a-c530623ffdfe" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.496000 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" 
event={"ID":"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17","Type":"ContainerStarted","Data":"b63eb09685b931765e74e7679a3294db9935972dc2b0f5516db015d3bb3a5987"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.501245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" event={"ID":"3bff7ce4-adb2-494b-8644-f8e7568efa62","Type":"ContainerStarted","Data":"c443080561dd4bb4f77578c088d02977024b46cfa312b66d6aa5d4e7e52d9744"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.501828 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.509783 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" podStartSLOduration=8.800265895999999 podStartE2EDuration="31.509766401s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.075678891 +0000 UTC m=+810.311977451" lastFinishedPulling="2025-10-12 20:38:04.785179396 +0000 UTC m=+833.021477956" observedRunningTime="2025-10-12 20:38:10.50650117 +0000 UTC m=+838.742799730" watchObservedRunningTime="2025-10-12 20:38:10.509766401 +0000 UTC m=+838.746064961" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.512139 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" event={"ID":"e3a81848-dc85-44b3-addf-35cb34c1e85a","Type":"ContainerStarted","Data":"2c4495ba71849603ffa41fb12e43375f5778bf2d1704afe6ed08f7b328f70e9f"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.512814 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.514131 4773 
generic.go:334] "Generic (PLEG): container finished" podID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerID="638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3" exitCode=0 Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.514578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerDied","Data":"638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.523821 4773 generic.go:334] "Generic (PLEG): container finished" podID="50e96b5e-1264-4224-917b-dccb3e792b70" containerID="bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f" exitCode=0 Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.523909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerDied","Data":"bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.532148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" event={"ID":"ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a","Type":"ContainerStarted","Data":"58c3e26e3ca96ac6ca9b01167746593a4c2526b8b85ae63b6f070582cbafe1db"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.532935 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.557171 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" event={"ID":"2e680c12-2026-4296-8ffa-d0185c12d2c1","Type":"ContainerStarted","Data":"1ae6409d23ebefda75774134eaf44cf3f64da20e940f56356d061bfd8e43bfa4"} Oct 12 
20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.557915 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.568087 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" podStartSLOduration=7.930784874 podStartE2EDuration="31.568071753s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.144163213 +0000 UTC m=+809.380461773" lastFinishedPulling="2025-10-12 20:38:04.781450092 +0000 UTC m=+833.017748652" observedRunningTime="2025-10-12 20:38:10.564482493 +0000 UTC m=+838.800781053" watchObservedRunningTime="2025-10-12 20:38:10.568071753 +0000 UTC m=+838.804370313" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.579811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" event={"ID":"69dd4207-8b02-4a43-bc3a-9c939881422f","Type":"ContainerStarted","Data":"f787f0e39d86f0d5e94bda2c74a8b725cbb39485894733242ebba84f16160f22"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.580604 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.595763 4773 generic.go:334] "Generic (PLEG): container finished" podID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerID="698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b" exitCode=0 Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.596015 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerDied","Data":"698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b"} 
Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.598991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" event={"ID":"f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b","Type":"ContainerStarted","Data":"0b0027cb9cc7559719ac1657d3f6fa6d8f80fe09833c2a2d8e1b54dbc558d24a"} Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.599655 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.612131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" event={"ID":"b9e5880b-293d-4311-8928-f93649649c93","Type":"ContainerStarted","Data":"79a8d44a31ce32217eed6f5c8066534c16bdb500eb6995bec3cbab4717081803"} Oct 12 20:38:10 crc kubenswrapper[4773]: E1012 20:38:10.612693 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" podUID="b2ec8f8f-d841-4683-86ed-54ec360d9ec1" Oct 12 20:38:10 crc kubenswrapper[4773]: E1012 20:38:10.613098 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" podUID="e38708a6-e3b7-407d-8fe5-f27cd9a69f76" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.673003 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" podStartSLOduration=7.878098888 podStartE2EDuration="31.672987951s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:40.988984317 +0000 UTC m=+809.225282877" lastFinishedPulling="2025-10-12 20:38:04.78387338 +0000 UTC m=+833.020171940" observedRunningTime="2025-10-12 20:38:10.639951932 +0000 UTC m=+838.876250492" watchObservedRunningTime="2025-10-12 20:38:10.672987951 +0000 UTC m=+838.909286511" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.673514 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" podStartSLOduration=8.8816717 podStartE2EDuration="31.673510075s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.994066121 +0000 UTC m=+810.230364681" lastFinishedPulling="2025-10-12 20:38:04.785904496 +0000 UTC m=+833.022203056" observedRunningTime="2025-10-12 20:38:10.673067993 +0000 UTC m=+838.909366553" watchObservedRunningTime="2025-10-12 20:38:10.673510075 +0000 UTC m=+838.909808635" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.711664 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" podStartSLOduration=7.465630976 podStartE2EDuration="31.711649156s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:40.937435733 +0000 UTC m=+809.173734293" lastFinishedPulling="2025-10-12 20:38:05.183453913 +0000 UTC m=+833.419752473" observedRunningTime="2025-10-12 20:38:10.70962226 +0000 UTC m=+838.945920820" watchObservedRunningTime="2025-10-12 20:38:10.711649156 +0000 UTC m=+838.947947706" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.761809 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" podStartSLOduration=8.43225538 podStartE2EDuration="31.761795081s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.453683661 +0000 UTC m=+809.689982211" lastFinishedPulling="2025-10-12 20:38:04.783223352 +0000 UTC m=+833.019521912" observedRunningTime="2025-10-12 20:38:10.758452538 +0000 UTC m=+838.994751098" watchObservedRunningTime="2025-10-12 20:38:10.761795081 +0000 UTC m=+838.998093641" Oct 12 20:38:10 crc kubenswrapper[4773]: I1012 20:38:10.881582 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" podStartSLOduration=3.714801886 podStartE2EDuration="31.881563822s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.78838381 +0000 UTC m=+810.024682370" lastFinishedPulling="2025-10-12 20:38:09.955145746 +0000 UTC m=+838.191444306" observedRunningTime="2025-10-12 20:38:10.875852113 +0000 UTC m=+839.112150683" watchObservedRunningTime="2025-10-12 20:38:10.881563822 +0000 UTC m=+839.117862382" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.046877 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" podStartSLOduration=3.219503821 podStartE2EDuration="32.046860289s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:40.941586959 +0000 UTC m=+809.177885519" lastFinishedPulling="2025-10-12 20:38:09.768943427 +0000 UTC m=+838.005241987" observedRunningTime="2025-10-12 20:38:10.982090608 +0000 UTC m=+839.218389168" watchObservedRunningTime="2025-10-12 20:38:11.046860289 +0000 UTC m=+839.283158849" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.047247 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" podStartSLOduration=4.452300388 podStartE2EDuration="32.04724361s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.359160755 +0000 UTC m=+810.595459315" lastFinishedPulling="2025-10-12 20:38:09.954103977 +0000 UTC m=+838.190402537" observedRunningTime="2025-10-12 20:38:11.043509326 +0000 UTC m=+839.279807886" watchObservedRunningTime="2025-10-12 20:38:11.04724361 +0000 UTC m=+839.283542170" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.086560 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" podStartSLOduration=9.114454334 podStartE2EDuration="32.086541503s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.777637971 +0000 UTC m=+810.013936531" lastFinishedPulling="2025-10-12 20:38:04.74972514 +0000 UTC m=+832.986023700" observedRunningTime="2025-10-12 20:38:11.085188295 +0000 UTC m=+839.321486855" watchObservedRunningTime="2025-10-12 20:38:11.086541503 +0000 UTC m=+839.322840063" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.114841 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.619601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" event={"ID":"095b027c-fb46-4d19-bbcf-84871f8c90f7","Type":"ContainerStarted","Data":"e493a703bd30f777e2b14fd2b3b22986de82f155101d3003bacceb38f444770f"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.619685 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:38:11 crc 
kubenswrapper[4773]: I1012 20:38:11.621609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" event={"ID":"5321f2fd-a14c-4a48-be68-bdbefe80aa8d","Type":"ContainerStarted","Data":"cef6c6f62ff11ade535b83423e8ed5e6babb3ee9260bc6eff2f1ffbed7c92877"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.621767 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.622938 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerStarted","Data":"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.624528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerStarted","Data":"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.625713 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" event={"ID":"6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17","Type":"ContainerStarted","Data":"2bb06ac0f79d6c37e5fffc2f0c3a2fa3a106a8f49e008e2283e2672bad5a7a01"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.625858 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.627492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" 
event={"ID":"e963f42c-7955-4378-927e-1ab264a6116e","Type":"ContainerStarted","Data":"237cf9ef81f4eea522c805b489ac6189e4116de9e2c356d9bdf4bc91724a0b68"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.627623 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.628973 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerStarted","Data":"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a"} Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.684743 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" podStartSLOduration=3.8115873689999997 podStartE2EDuration="32.68471308s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.994002499 +0000 UTC m=+810.230301059" lastFinishedPulling="2025-10-12 20:38:10.86712821 +0000 UTC m=+839.103426770" observedRunningTime="2025-10-12 20:38:11.672686955 +0000 UTC m=+839.908985515" watchObservedRunningTime="2025-10-12 20:38:11.68471308 +0000 UTC m=+839.921011640" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.777208 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" podStartSLOduration=3.319996776 podStartE2EDuration="32.777195092s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.41048556 +0000 UTC m=+809.646784120" lastFinishedPulling="2025-10-12 20:38:10.867683886 +0000 UTC m=+839.103982436" observedRunningTime="2025-10-12 20:38:11.77499201 +0000 UTC m=+840.011290570" watchObservedRunningTime="2025-10-12 20:38:11.777195092 +0000 UTC m=+840.013493642" Oct 
12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.805390 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" podStartSLOduration=3.625056021 podStartE2EDuration="32.805374436s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.037789247 +0000 UTC m=+810.274087807" lastFinishedPulling="2025-10-12 20:38:11.218107662 +0000 UTC m=+839.454406222" observedRunningTime="2025-10-12 20:38:11.794234646 +0000 UTC m=+840.030533206" watchObservedRunningTime="2025-10-12 20:38:11.805374436 +0000 UTC m=+840.041672996" Oct 12 20:38:11 crc kubenswrapper[4773]: I1012 20:38:11.840111 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" podStartSLOduration=3.384085478 podStartE2EDuration="32.840097181s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.772301103 +0000 UTC m=+810.008599663" lastFinishedPulling="2025-10-12 20:38:11.228312806 +0000 UTC m=+839.464611366" observedRunningTime="2025-10-12 20:38:11.836270565 +0000 UTC m=+840.072569125" watchObservedRunningTime="2025-10-12 20:38:11.840097181 +0000 UTC m=+840.076395741" Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.653166 4773 generic.go:334] "Generic (PLEG): container finished" podID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerID="ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707" exitCode=0 Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.653492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerDied","Data":"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707"} Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.662064 4773 generic.go:334] "Generic (PLEG): container 
finished" podID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerID="64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834" exitCode=0 Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.662146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerDied","Data":"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834"} Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.678123 4773 generic.go:334] "Generic (PLEG): container finished" podID="50e96b5e-1264-4224-917b-dccb3e792b70" containerID="e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a" exitCode=0 Oct 12 20:38:12 crc kubenswrapper[4773]: I1012 20:38:12.682129 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerDied","Data":"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a"} Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.685517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerStarted","Data":"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882"} Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.689009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" event={"ID":"83700a3c-4ccd-4ac6-8c0a-c530623ffdfe","Type":"ContainerStarted","Data":"bfbd6710824f54f775982cc0fbe30c70c25c539899a95c483290d9456765ce58"} Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.689535 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.691246 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerStarted","Data":"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86"} Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.693480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerStarted","Data":"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df"} Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.705542 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-82kvr" podStartSLOduration=4.224949953 podStartE2EDuration="6.705526123s" podCreationTimestamp="2025-10-12 20:38:07 +0000 UTC" firstStartedPulling="2025-10-12 20:38:10.597227404 +0000 UTC m=+838.833525964" lastFinishedPulling="2025-10-12 20:38:13.077803574 +0000 UTC m=+841.314102134" observedRunningTime="2025-10-12 20:38:13.703146517 +0000 UTC m=+841.939445077" watchObservedRunningTime="2025-10-12 20:38:13.705526123 +0000 UTC m=+841.941824683" Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.729060 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pzqcq" podStartSLOduration=18.148259359 podStartE2EDuration="20.729042957s" podCreationTimestamp="2025-10-12 20:37:53 +0000 UTC" firstStartedPulling="2025-10-12 20:38:10.527886845 +0000 UTC m=+838.764185405" lastFinishedPulling="2025-10-12 20:38:13.108670443 +0000 UTC m=+841.344969003" observedRunningTime="2025-10-12 20:38:13.726212878 +0000 UTC m=+841.962511438" watchObservedRunningTime="2025-10-12 20:38:13.729042957 +0000 UTC m=+841.965341517" Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.740810 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" podStartSLOduration=3.694874163 podStartE2EDuration="34.740795564s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:41.533482981 +0000 UTC m=+809.769781541" lastFinishedPulling="2025-10-12 20:38:12.579404392 +0000 UTC m=+840.815702942" observedRunningTime="2025-10-12 20:38:13.739497538 +0000 UTC m=+841.975796108" watchObservedRunningTime="2025-10-12 20:38:13.740795564 +0000 UTC m=+841.977094124" Oct 12 20:38:13 crc kubenswrapper[4773]: I1012 20:38:13.765698 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4k6v" podStartSLOduration=8.975240418 podStartE2EDuration="11.765678236s" podCreationTimestamp="2025-10-12 20:38:02 +0000 UTC" firstStartedPulling="2025-10-12 20:38:10.515070489 +0000 UTC m=+838.751369049" lastFinishedPulling="2025-10-12 20:38:13.305508307 +0000 UTC m=+841.541806867" observedRunningTime="2025-10-12 20:38:13.760035289 +0000 UTC m=+841.996333839" watchObservedRunningTime="2025-10-12 20:38:13.765678236 +0000 UTC m=+842.001976806" Oct 12 20:38:17 crc kubenswrapper[4773]: I1012 20:38:17.806351 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:17 crc kubenswrapper[4773]: I1012 20:38:17.806741 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:17 crc kubenswrapper[4773]: I1012 20:38:17.850330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:18 crc kubenswrapper[4773]: I1012 20:38:18.774425 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.526190 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-rqvz2" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.533861 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-4wwwj" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.570390 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-sfgw7" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.617690 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-llrqr" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.649198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xnqzt" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.709280 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-shp22" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.851440 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-fqqwc" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.853556 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.988148 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-jh557" Oct 12 20:38:19 crc kubenswrapper[4773]: I1012 20:38:19.988658 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-sc6z2" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.050898 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-pbmbc" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.186352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-b6kht" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.316029 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-dtppq" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.411288 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-8j4jx" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.455910 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-nvvg8" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.535161 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-rczp5" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.555620 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-thj2w" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.660985 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-bcdc4" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.731741 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-4dsfj" Oct 12 20:38:20 crc kubenswrapper[4773]: I1012 20:38:20.742165 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-82kvr" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="registry-server" containerID="cri-o://ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882" gracePeriod=2 Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.120707 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.121395 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.321996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content\") pod \"c86df0c4-a7dc-4309-9fc1-faac3382346c\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.322051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4l8\" (UniqueName: \"kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8\") pod \"c86df0c4-a7dc-4309-9fc1-faac3382346c\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.322095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities\") pod \"c86df0c4-a7dc-4309-9fc1-faac3382346c\" (UID: \"c86df0c4-a7dc-4309-9fc1-faac3382346c\") " Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 
20:38:21.322989 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities" (OuterVolumeSpecName: "utilities") pod "c86df0c4-a7dc-4309-9fc1-faac3382346c" (UID: "c86df0c4-a7dc-4309-9fc1-faac3382346c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.323566 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.327897 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8" (OuterVolumeSpecName: "kube-api-access-8k4l8") pod "c86df0c4-a7dc-4309-9fc1-faac3382346c" (UID: "c86df0c4-a7dc-4309-9fc1-faac3382346c"). InnerVolumeSpecName "kube-api-access-8k4l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.334521 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c86df0c4-a7dc-4309-9fc1-faac3382346c" (UID: "c86df0c4-a7dc-4309-9fc1-faac3382346c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.425460 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c86df0c4-a7dc-4309-9fc1-faac3382346c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.425500 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4l8\" (UniqueName: \"kubernetes.io/projected/c86df0c4-a7dc-4309-9fc1-faac3382346c-kube-api-access-8k4l8\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.752333 4773 generic.go:334] "Generic (PLEG): container finished" podID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerID="ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882" exitCode=0 Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.752381 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerDied","Data":"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882"} Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.752418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82kvr" event={"ID":"c86df0c4-a7dc-4309-9fc1-faac3382346c","Type":"ContainerDied","Data":"0755fa685a06d5d90c746461eec544689337071843e0274d9e49d167169b5394"} Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.752442 4773 scope.go:117] "RemoveContainer" containerID="ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.755157 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82kvr" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.770432 4773 scope.go:117] "RemoveContainer" containerID="64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.797547 4773 scope.go:117] "RemoveContainer" containerID="698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.807708 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.816540 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-82kvr"] Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.824745 4773 scope.go:117] "RemoveContainer" containerID="ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882" Oct 12 20:38:21 crc kubenswrapper[4773]: E1012 20:38:21.825165 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882\": container with ID starting with ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882 not found: ID does not exist" containerID="ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.825224 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882"} err="failed to get container status \"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882\": rpc error: code = NotFound desc = could not find container \"ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882\": container with ID starting with ea1b1bdeeee84c0a64da6920f2360ee2064776a70faf004327ba7a5812998882 not found: 
ID does not exist" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.825252 4773 scope.go:117] "RemoveContainer" containerID="64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834" Oct 12 20:38:21 crc kubenswrapper[4773]: E1012 20:38:21.825583 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834\": container with ID starting with 64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834 not found: ID does not exist" containerID="64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.825627 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834"} err="failed to get container status \"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834\": rpc error: code = NotFound desc = could not find container \"64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834\": container with ID starting with 64e1f5a2750a27bf60ad5131d8fce920269e1322b97f787992ad1cc0dcbaa834 not found: ID does not exist" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.825660 4773 scope.go:117] "RemoveContainer" containerID="698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b" Oct 12 20:38:21 crc kubenswrapper[4773]: E1012 20:38:21.825922 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b\": container with ID starting with 698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b not found: ID does not exist" containerID="698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b" Oct 12 20:38:21 crc kubenswrapper[4773]: I1012 20:38:21.825978 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b"} err="failed to get container status \"698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b\": rpc error: code = NotFound desc = could not find container \"698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b\": container with ID starting with 698fd45237d9eafe50701c1a5941488d4e00c850651f34a195a579229e5a441b not found: ID does not exist" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.489896 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" path="/var/lib/kubelet/pods/c86df0c4-a7dc-4309-9fc1-faac3382346c/volumes" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.702632 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.702674 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.746027 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.761670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" event={"ID":"b2ec8f8f-d841-4683-86ed-54ec360d9ec1","Type":"ContainerStarted","Data":"03c26b8858577290051f2f1713fc6b81d67c53346fba6f6de88e0dbb70db16d5"} Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.762402 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.789652 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" podStartSLOduration=4.003365274 podStartE2EDuration="43.789632565s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.233608753 +0000 UTC m=+810.469907313" lastFinishedPulling="2025-10-12 20:38:22.019876054 +0000 UTC m=+850.256174604" observedRunningTime="2025-10-12 20:38:22.787094285 +0000 UTC m=+851.023392845" watchObservedRunningTime="2025-10-12 20:38:22.789632565 +0000 UTC m=+851.025931135" Oct 12 20:38:22 crc kubenswrapper[4773]: I1012 20:38:22.803449 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:23 crc kubenswrapper[4773]: I1012 20:38:23.490040 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:23 crc kubenswrapper[4773]: I1012 20:38:23.491566 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:23 crc kubenswrapper[4773]: I1012 20:38:23.551381 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:23 crc kubenswrapper[4773]: I1012 20:38:23.819994 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:24 crc kubenswrapper[4773]: I1012 20:38:24.855232 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:24 crc kubenswrapper[4773]: I1012 20:38:24.855427 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4k6v" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="registry-server" containerID="cri-o://722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df" 
gracePeriod=2 Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.325219 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.488994 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content\") pod \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.489256 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf28c\" (UniqueName: \"kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c\") pod \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.489465 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities\") pod \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\" (UID: \"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4\") " Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.490544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities" (OuterVolumeSpecName: "utilities") pod "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" (UID: "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.496619 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c" (OuterVolumeSpecName: "kube-api-access-wf28c") pod "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" (UID: "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4"). InnerVolumeSpecName "kube-api-access-wf28c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.545854 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" (UID: "a0d4e643-b050-4c7c-8a31-3e0a793e1cc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.591624 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf28c\" (UniqueName: \"kubernetes.io/projected/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-kube-api-access-wf28c\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.591653 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.591663 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.808292 4773 generic.go:334] "Generic (PLEG): container finished" podID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" 
containerID="722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df" exitCode=0 Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.808481 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerDied","Data":"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df"} Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.808509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4k6v" event={"ID":"a0d4e643-b050-4c7c-8a31-3e0a793e1cc4","Type":"ContainerDied","Data":"0dd591e5cca2a81846e0500184a4fc40ae3e74a13d6a78e4bf416175ae6d21cd"} Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.808527 4773 scope.go:117] "RemoveContainer" containerID="722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.808668 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4k6v" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.842864 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.851695 4773 scope.go:117] "RemoveContainer" containerID="ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.851769 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4k6v"] Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.864989 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.865760 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pzqcq" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="registry-server" containerID="cri-o://3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86" gracePeriod=2 Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.871787 4773 scope.go:117] "RemoveContainer" containerID="638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.895175 4773 scope.go:117] "RemoveContainer" containerID="722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df" Oct 12 20:38:25 crc kubenswrapper[4773]: E1012 20:38:25.895654 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df\": container with ID starting with 722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df not found: ID does not exist" containerID="722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df" Oct 12 20:38:25 crc 
kubenswrapper[4773]: I1012 20:38:25.895692 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df"} err="failed to get container status \"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df\": rpc error: code = NotFound desc = could not find container \"722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df\": container with ID starting with 722d6abf6564fdc280e6cd45e7fb148214012be303d3a3d2abb2db228c85e4df not found: ID does not exist" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.895736 4773 scope.go:117] "RemoveContainer" containerID="ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707" Oct 12 20:38:25 crc kubenswrapper[4773]: E1012 20:38:25.896168 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707\": container with ID starting with ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707 not found: ID does not exist" containerID="ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.896222 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707"} err="failed to get container status \"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707\": rpc error: code = NotFound desc = could not find container \"ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707\": container with ID starting with ff05d2516718ed308b53400c6b1f8cdffa07b484ca8687d47cf8753a0ca17707 not found: ID does not exist" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.896251 4773 scope.go:117] "RemoveContainer" containerID="638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3" Oct 12 
20:38:25 crc kubenswrapper[4773]: E1012 20:38:25.896609 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3\": container with ID starting with 638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3 not found: ID does not exist" containerID="638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3" Oct 12 20:38:25 crc kubenswrapper[4773]: I1012 20:38:25.896638 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3"} err="failed to get container status \"638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3\": rpc error: code = NotFound desc = could not find container \"638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3\": container with ID starting with 638bca5a54ef7962574c5b275335e5e72c4846cae4fa7da970758f1716612ee3 not found: ID does not exist" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.344351 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.421766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content\") pod \"50e96b5e-1264-4224-917b-dccb3e792b70\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.421914 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p7mv\" (UniqueName: \"kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv\") pod \"50e96b5e-1264-4224-917b-dccb3e792b70\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.421990 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities\") pod \"50e96b5e-1264-4224-917b-dccb3e792b70\" (UID: \"50e96b5e-1264-4224-917b-dccb3e792b70\") " Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.422506 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities" (OuterVolumeSpecName: "utilities") pod "50e96b5e-1264-4224-917b-dccb3e792b70" (UID: "50e96b5e-1264-4224-917b-dccb3e792b70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.422762 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.426764 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv" (OuterVolumeSpecName: "kube-api-access-7p7mv") pod "50e96b5e-1264-4224-917b-dccb3e792b70" (UID: "50e96b5e-1264-4224-917b-dccb3e792b70"). InnerVolumeSpecName "kube-api-access-7p7mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.488972 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" path="/var/lib/kubelet/pods/a0d4e643-b050-4c7c-8a31-3e0a793e1cc4/volumes" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.497399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50e96b5e-1264-4224-917b-dccb3e792b70" (UID: "50e96b5e-1264-4224-917b-dccb3e792b70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.523453 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e96b5e-1264-4224-917b-dccb3e792b70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.523484 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p7mv\" (UniqueName: \"kubernetes.io/projected/50e96b5e-1264-4224-917b-dccb3e792b70-kube-api-access-7p7mv\") on node \"crc\" DevicePath \"\"" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.818585 4773 generic.go:334] "Generic (PLEG): container finished" podID="50e96b5e-1264-4224-917b-dccb3e792b70" containerID="3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86" exitCode=0 Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.818648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerDied","Data":"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86"} Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.818678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pzqcq" event={"ID":"50e96b5e-1264-4224-917b-dccb3e792b70","Type":"ContainerDied","Data":"dfa01ec59db686d436e6b215bf63fa0bcd6df23a360a84e1f48fc84c9ed30dec"} Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.818700 4773 scope.go:117] "RemoveContainer" containerID="3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.818812 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pzqcq" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.827918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" event={"ID":"e38708a6-e3b7-407d-8fe5-f27cd9a69f76","Type":"ContainerStarted","Data":"d60c74da79446c4f06c555727b33e8de82506670904605a7a2a4d2f771056f85"} Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.828460 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.843335 4773 scope.go:117] "RemoveContainer" containerID="e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.843735 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" podStartSLOduration=4.053443844 podStartE2EDuration="47.843698837s" podCreationTimestamp="2025-10-12 20:37:39 +0000 UTC" firstStartedPulling="2025-10-12 20:37:42.234664963 +0000 UTC m=+810.470963523" lastFinishedPulling="2025-10-12 20:38:26.024919956 +0000 UTC m=+854.261218516" observedRunningTime="2025-10-12 20:38:26.8412435 +0000 UTC m=+855.077542060" watchObservedRunningTime="2025-10-12 20:38:26.843698837 +0000 UTC m=+855.079997387" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.858870 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.864372 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pzqcq"] Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.874015 4773 scope.go:117] "RemoveContainer" containerID="bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f" Oct 12 20:38:26 crc 
kubenswrapper[4773]: I1012 20:38:26.889287 4773 scope.go:117] "RemoveContainer" containerID="3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86" Oct 12 20:38:26 crc kubenswrapper[4773]: E1012 20:38:26.889645 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86\": container with ID starting with 3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86 not found: ID does not exist" containerID="3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.889681 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86"} err="failed to get container status \"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86\": rpc error: code = NotFound desc = could not find container \"3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86\": container with ID starting with 3cfe3477df6cbc91ae835bcf0688391b539ef8405622210fd358398682b7bd86 not found: ID does not exist" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.889712 4773 scope.go:117] "RemoveContainer" containerID="e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a" Oct 12 20:38:26 crc kubenswrapper[4773]: E1012 20:38:26.890000 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a\": container with ID starting with e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a not found: ID does not exist" containerID="e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.890038 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a"} err="failed to get container status \"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a\": rpc error: code = NotFound desc = could not find container \"e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a\": container with ID starting with e40fb8b71215104c895e7ce8d3b408521f152e47e66252b380e0e0f71cb92b8a not found: ID does not exist" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.890065 4773 scope.go:117] "RemoveContainer" containerID="bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f" Oct 12 20:38:26 crc kubenswrapper[4773]: E1012 20:38:26.890541 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f\": container with ID starting with bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f not found: ID does not exist" containerID="bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f" Oct 12 20:38:26 crc kubenswrapper[4773]: I1012 20:38:26.890582 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f"} err="failed to get container status \"bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f\": rpc error: code = NotFound desc = could not find container \"bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f\": container with ID starting with bae5fd2a051087ab6a23f50dd8375cac25f0aba1c97c506958032a23ac4fb65f not found: ID does not exist" Oct 12 20:38:28 crc kubenswrapper[4773]: I1012 20:38:28.492247 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" path="/var/lib/kubelet/pods/50e96b5e-1264-4224-917b-dccb3e792b70/volumes" Oct 12 20:38:30 crc kubenswrapper[4773]: I1012 
20:38:30.613412 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vx9cr" Oct 12 20:38:40 crc kubenswrapper[4773]: I1012 20:38:40.678671 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-hz6mw" Oct 12 20:38:58 crc kubenswrapper[4773]: I1012 20:38:58.669740 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:38:58 crc kubenswrapper[4773]: I1012 20:38:58.670337 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.727261 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728279 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728296 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728319 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728324 4773 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728340 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728346 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728356 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728362 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728377 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728383 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728391 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728397 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728405 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728411 4773 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="extract-content" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728420 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728425 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: E1012 20:39:01.728437 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728442 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="extract-utilities" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728586 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d4e643-b050-4c7c-8a31-3e0a793e1cc4" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728600 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e96b5e-1264-4224-917b-dccb3e792b70" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.728609 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86df0c4-a7dc-4309-9fc1-faac3382346c" containerName="registry-server" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.729329 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.745564 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.745984 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.746291 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.746499 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jvkn2" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.767191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.837988 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.839183 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.841639 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.853181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.917677 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7b6\" (UniqueName: \"kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:01 crc kubenswrapper[4773]: I1012 20:39:01.917789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.018988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7b6\" (UniqueName: \"kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.019047 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" 
Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.019081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.019116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.019144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglw6\" (UniqueName: \"kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.020151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.042913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7b6\" (UniqueName: \"kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6\") pod \"dnsmasq-dns-7bfcb9d745-gsg29\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 
20:39:02.045603 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.120098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.120369 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.120404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglw6\" (UniqueName: \"kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.120930 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.121479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " 
pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.147687 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglw6\" (UniqueName: \"kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6\") pod \"dnsmasq-dns-758b79db4c-d24rk\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.162509 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.519938 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.528857 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:39:02 crc kubenswrapper[4773]: I1012 20:39:02.632108 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:02 crc kubenswrapper[4773]: W1012 20:39:02.640875 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0bc2f7_41a4_4b5e_9dcd_242ceb2826a4.slice/crio-2c52542a46521e10569583caac72cf600d7708c5dbf29a07a250356b037de7a1 WatchSource:0}: Error finding container 2c52542a46521e10569583caac72cf600d7708c5dbf29a07a250356b037de7a1: Status 404 returned error can't find the container with id 2c52542a46521e10569583caac72cf600d7708c5dbf29a07a250356b037de7a1 Oct 12 20:39:03 crc kubenswrapper[4773]: I1012 20:39:03.105918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" event={"ID":"d7e20e92-b81b-467d-9208-e903575232da","Type":"ContainerStarted","Data":"0a2a8ae8aa24b4194c5dd727abb5b04851348921587976c8d802a88df46bea2f"} Oct 12 20:39:03 crc 
kubenswrapper[4773]: I1012 20:39:03.107764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" event={"ID":"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4","Type":"ContainerStarted","Data":"2c52542a46521e10569583caac72cf600d7708c5dbf29a07a250356b037de7a1"} Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.822887 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.849747 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.851103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.867626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.873853 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.873903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.873948 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfbg\" (UniqueName: 
\"kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.975453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.975521 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.975561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfbg\" (UniqueName: \"kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.977079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:04 crc kubenswrapper[4773]: I1012 20:39:04.977670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config\") pod 
\"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.025351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfbg\" (UniqueName: \"kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg\") pod \"dnsmasq-dns-644597f84c-hfxh9\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.179817 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.190613 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.212947 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.214295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.234547 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.384475 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.384542 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwj4\" (UniqueName: \"kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.384571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.507654 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.515010 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwj4\" (UniqueName: 
\"kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.515066 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.516467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.517344 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.557265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwj4\" (UniqueName: \"kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4\") pod \"dnsmasq-dns-77597f887-tf5bv\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.586029 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:05 crc kubenswrapper[4773]: I1012 20:39:05.748786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.028173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.029335 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.032635 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.035107 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.042116 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.042375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-crqm6" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.042540 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.042690 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.043184 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.043852 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.142576 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.142963 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143049 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143107 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143145 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8ll\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143179 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.143195 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.157778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" event={"ID":"9c48d718-b6a7-4625-9236-9dc52f24e3a3","Type":"ContainerStarted","Data":"66e456125a078db4bf713629371378eac31ec50b92d24f98be92f6c6b2a93c43"} Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.185392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244626 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244648 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244743 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244758 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244783 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8ll\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " 
pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244818 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.244831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.246048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.246063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.246295 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.247644 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.247741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.247841 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.251601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.256016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.258587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " 
pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.260665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.263295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8ll\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.273059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.354935 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.356503 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.359673 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.360426 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.360608 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.360621 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.361071 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.363203 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.365586 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x5n8n" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.365773 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.366795 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.551469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.551747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.551765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552009 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcr9t\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552033 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") 
" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552124 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552150 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552166 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.552191 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657538 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcr9t\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657681 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657706 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.657734 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.660438 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.664802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.664996 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.669645 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.670931 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.671203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.671520 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.672049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.677570 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc 
kubenswrapper[4773]: I1012 20:39:06.677858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.697209 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcr9t\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.702099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.728696 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:39:06 crc kubenswrapper[4773]: I1012 20:39:06.992954 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.166107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-tf5bv" event={"ID":"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4","Type":"ContainerStarted","Data":"129041033e247a7f108b5f44560d823744a294326e7a8cf1da62da381efdeb17"} Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.182601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerStarted","Data":"421148360c5536826f42d7d029b7c9e4756c7362409615daf033eaff55530534"} Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.697888 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.716090 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.733306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.733402 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.747656 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ff6m6" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.747854 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.747971 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.748093 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.748205 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.775774 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891650 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-b8fdr\" (UniqueName: \"kubernetes.io/projected/0cb52152-8a83-4122-be94-0d2803fd5cc7-kube-api-access-b8fdr\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891669 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891740 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-secrets\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.891819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993397 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993480 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fdr\" (UniqueName: \"kubernetes.io/projected/0cb52152-8a83-4122-be94-0d2803fd5cc7-kube-api-access-b8fdr\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-secrets\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993596 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.993615 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.994570 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:07 crc kubenswrapper[4773]: I1012 20:39:07.994590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.003395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.003708 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.011606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb52152-8a83-4122-be94-0d2803fd5cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.032984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-secrets\") pod 
\"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.049010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fdr\" (UniqueName: \"kubernetes.io/projected/0cb52152-8a83-4122-be94-0d2803fd5cc7-kube-api-access-b8fdr\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.061257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.090948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.098057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb52152-8a83-4122-be94-0d2803fd5cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0cb52152-8a83-4122-be94-0d2803fd5cc7\") " pod="openstack/openstack-galera-0" Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.226646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerStarted","Data":"e75f2c899b3685bdbff3909de10f8480e11609fecef89b813dc645936daa8835"} Oct 12 20:39:08 crc kubenswrapper[4773]: I1012 20:39:08.389029 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.020497 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.024104 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.029300 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.029532 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.029655 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fvkg7" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.029824 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.030666 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.121921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.121966 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.121988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xgdc\" (UniqueName: \"kubernetes.io/projected/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kube-api-access-4xgdc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.122150 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.151035 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgdc\" (UniqueName: \"kubernetes.io/projected/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kube-api-access-4xgdc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.224415 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.226221 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.226272 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.226649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.227385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.233227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.233388 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.248668 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.254355 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgdc\" (UniqueName: \"kubernetes.io/projected/7566f11c-8e52-4fb6-b1a2-98b388ffefd9-kube-api-access-4xgdc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 
20:39:09.291240 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0cb52152-8a83-4122-be94-0d2803fd5cc7","Type":"ContainerStarted","Data":"54a8768cb6c8871e9c7051dae6bf717059581b819b62aa6498e5d0f4a9f0ecb5"} Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.304378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7566f11c-8e52-4fb6-b1a2-98b388ffefd9\") " pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.375932 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.386374 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.388349 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.391085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.391222 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.391337 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ps9f7" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.400402 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.536411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-kolla-config\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.537774 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.537946 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrmnl\" (UniqueName: \"kubernetes.io/projected/8aeb19a9-db56-488b-9410-004f24e8d11a-kube-api-access-xrmnl\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.538107 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.538604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-config-data\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.640853 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.641219 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-config-data\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.641261 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-kolla-config\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.641286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-memcached-tls-certs\") pod \"memcached-0\" 
(UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.641335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrmnl\" (UniqueName: \"kubernetes.io/projected/8aeb19a9-db56-488b-9410-004f24e8d11a-kube-api-access-xrmnl\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.642424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-kolla-config\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.648910 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.649664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeb19a9-db56-488b-9410-004f24e8d11a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.649871 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8aeb19a9-db56-488b-9410-004f24e8d11a-config-data\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.672172 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrmnl\" 
(UniqueName: \"kubernetes.io/projected/8aeb19a9-db56-488b-9410-004f24e8d11a-kube-api-access-xrmnl\") pod \"memcached-0\" (UID: \"8aeb19a9-db56-488b-9410-004f24e8d11a\") " pod="openstack/memcached-0" Oct 12 20:39:09 crc kubenswrapper[4773]: I1012 20:39:09.705420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 12 20:39:10 crc kubenswrapper[4773]: I1012 20:39:10.036587 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.241439 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.243366 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.245832 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jrvmq" Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.273027 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.375413 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hd9h\" (UniqueName: \"kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h\") pod \"kube-state-metrics-0\" (UID: \"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.491976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hd9h\" (UniqueName: \"kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h\") pod \"kube-state-metrics-0\" (UID: \"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0\") " pod="openstack/kube-state-metrics-0" Oct 12 
20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.562980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hd9h\" (UniqueName: \"kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h\") pod \"kube-state-metrics-0\" (UID: \"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:39:11 crc kubenswrapper[4773]: I1012 20:39:11.592104 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.576981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sf74r"] Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.578315 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.582085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.582356 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4wgj6" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.582521 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586608 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-scripts\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-combined-ca-bundle\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-log-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm97c\" (UniqueName: \"kubernetes.io/projected/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-kube-api-access-pm97c\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586755 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-ovn-controller-tls-certs\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.586775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.588153 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wfpwq"] Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.590842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.626405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wfpwq"] Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.664292 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sf74r"] Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-ovn-controller-tls-certs\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: 
I1012 20:39:15.687828 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-scripts\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687865 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-combined-ca-bundle\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687882 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-log-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.687906 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm97c\" (UniqueName: \"kubernetes.io/projected/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-kube-api-access-pm97c\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.693496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-log-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.693731 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.693852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-scripts\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.695073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-var-run-ovn\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.698154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-ovn-controller-tls-certs\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.698454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-combined-ca-bundle\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.717994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm97c\" (UniqueName: \"kubernetes.io/projected/1a08bcbe-fa8c-43b2-a4fb-ae2212de940d-kube-api-access-pm97c\") pod \"ovn-controller-sf74r\" (UID: \"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d\") " 
pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-run\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg5cx\" (UniqueName: \"kubernetes.io/projected/55ef70a8-016d-403f-ab02-820088160f9c-kube-api-access-wg5cx\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789705 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-log\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-lib\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-etc-ovs\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 
12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.789841 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ef70a8-016d-403f-ab02-820088160f9c-scripts\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.894090 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sf74r" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-etc-ovs\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898566 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ef70a8-016d-403f-ab02-820088160f9c-scripts\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-run\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg5cx\" (UniqueName: \"kubernetes.io/projected/55ef70a8-016d-403f-ab02-820088160f9c-kube-api-access-wg5cx\") pod \"ovn-controller-ovs-wfpwq\" (UID: 
\"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898657 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-log\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-lib\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-run\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898972 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-lib\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.898976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-etc-ovs\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.899048 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55ef70a8-016d-403f-ab02-820088160f9c-var-log\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.900518 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ef70a8-016d-403f-ab02-820088160f9c-scripts\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:15 crc kubenswrapper[4773]: I1012 20:39:15.924593 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg5cx\" (UniqueName: \"kubernetes.io/projected/55ef70a8-016d-403f-ab02-820088160f9c-kube-api-access-wg5cx\") pod \"ovn-controller-ovs-wfpwq\" (UID: \"55ef70a8-016d-403f-ab02-820088160f9c\") " pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:16 crc kubenswrapper[4773]: W1012 20:39:16.108870 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7566f11c_8e52_4fb6_b1a2_98b388ffefd9.slice/crio-78d39fa3c17a155ca2b0621790225f72fbb90ab83abb8b3ffd15b7e96df72f71 WatchSource:0}: Error finding container 78d39fa3c17a155ca2b0621790225f72fbb90ab83abb8b3ffd15b7e96df72f71: Status 404 returned error can't find the container with id 78d39fa3c17a155ca2b0621790225f72fbb90ab83abb8b3ffd15b7e96df72f71 Oct 12 20:39:16 crc kubenswrapper[4773]: I1012 20:39:16.204215 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:16 crc kubenswrapper[4773]: I1012 20:39:16.381940 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7566f11c-8e52-4fb6-b1a2-98b388ffefd9","Type":"ContainerStarted","Data":"78d39fa3c17a155ca2b0621790225f72fbb90ab83abb8b3ffd15b7e96df72f71"} Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.457396 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.461383 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.465465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-cslhq" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.466015 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.466273 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.466474 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.466695 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.467150 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.527922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-scripts\") pod \"ovsdbserver-sb-0\" 
(UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.527963 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pp6n\" (UniqueName: \"kubernetes.io/projected/21606d72-32b0-4552-ac26-df0425f03cdf-kube-api-access-8pp6n\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528061 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " 
pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.528630 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.630188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.630250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.630282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.630302 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.630338 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631560 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pp6n\" (UniqueName: \"kubernetes.io/projected/21606d72-32b0-4552-ac26-df0425f03cdf-kube-api-access-8pp6n\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631737 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-config\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.631980 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.636352 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21606d72-32b0-4552-ac26-df0425f03cdf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.638234 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.650294 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pp6n\" (UniqueName: \"kubernetes.io/projected/21606d72-32b0-4552-ac26-df0425f03cdf-kube-api-access-8pp6n\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.650327 4773 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.650937 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/21606d72-32b0-4552-ac26-df0425f03cdf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.669188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"21606d72-32b0-4552-ac26-df0425f03cdf\") " pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:17 crc kubenswrapper[4773]: I1012 20:39:17.803154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.385154 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.386622 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.391173 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.408959 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.409124 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.409241 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.409268 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wr6bq" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548163 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xfh\" (UniqueName: \"kubernetes.io/projected/dc85890c-cde2-470e-87de-4d69f1682bd0-kube-api-access-n5xfh\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.548614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " 
pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650479 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650502 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650517 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650575 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-n5xfh\" (UniqueName: \"kubernetes.io/projected/dc85890c-cde2-470e-87de-4d69f1682bd0-kube-api-access-n5xfh\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.650644 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.651853 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.652043 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-config\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " 
pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.655048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc85890c-cde2-470e-87de-4d69f1682bd0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.657831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.660101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.666886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xfh\" (UniqueName: \"kubernetes.io/projected/dc85890c-cde2-470e-87de-4d69f1682bd0-kube-api-access-n5xfh\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.678804 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc85890c-cde2-470e-87de-4d69f1682bd0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.688182 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dc85890c-cde2-470e-87de-4d69f1682bd0\") " pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:18 crc kubenswrapper[4773]: I1012 20:39:18.735504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:28 crc kubenswrapper[4773]: I1012 20:39:28.669860 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:39:28 crc kubenswrapper[4773]: I1012 20:39:28.670557 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.437663 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.438253 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv 
--bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pglw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-758b79db4c-d24rk_openstack(0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.439597 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" podUID="0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.459979 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.460142 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmfbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-644597f84c-hfxh9_openstack(9c48d718-b6a7-4625-9236-9dc52f24e3a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.461629 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" podUID="9c48d718-b6a7-4625-9236-9dc52f24e3a3" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.475374 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.475558 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ht7b6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-gsg29_openstack(d7e20e92-b81b-467d-9208-e903575232da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.476737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" podUID="d7e20e92-b81b-467d-9208-e903575232da" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.478447 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.478616 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5pwj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77597f887-tf5bv_openstack(37254faf-7d6e-4fe7-a4b5-4198c4cdafa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.480913 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77597f887-tf5bv" podUID="37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.539388 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" podUID="9c48d718-b6a7-4625-9236-9dc52f24e3a3" Oct 12 20:39:30 crc kubenswrapper[4773]: E1012 20:39:30.545737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-77597f887-tf5bv" podUID="37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.015303 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.139451 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.163009 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277204 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config\") pod \"d7e20e92-b81b-467d-9208-e903575232da\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc\") pod \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277345 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglw6\" (UniqueName: \"kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6\") pod \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht7b6\" (UniqueName: 
\"kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6\") pod \"d7e20e92-b81b-467d-9208-e903575232da\" (UID: \"d7e20e92-b81b-467d-9208-e903575232da\") " Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config\") pod \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\" (UID: \"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4\") " Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config" (OuterVolumeSpecName: "config") pod "d7e20e92-b81b-467d-9208-e903575232da" (UID: "d7e20e92-b81b-467d-9208-e903575232da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.277985 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e20e92-b81b-467d-9208-e903575232da-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.278370 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4" (UID: "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.278431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config" (OuterVolumeSpecName: "config") pod "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4" (UID: "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.283823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6" (OuterVolumeSpecName: "kube-api-access-pglw6") pod "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4" (UID: "0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4"). InnerVolumeSpecName "kube-api-access-pglw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.283909 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6" (OuterVolumeSpecName: "kube-api-access-ht7b6") pod "d7e20e92-b81b-467d-9208-e903575232da" (UID: "d7e20e92-b81b-467d-9208-e903575232da"). InnerVolumeSpecName "kube-api-access-ht7b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.317109 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:39:31 crc kubenswrapper[4773]: W1012 20:39:31.326416 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9d2321_06c1_42e1_b4a9_d6a759d0eba0.slice/crio-5a759c1a95e42252828f3e19421e3b56cc70ebf51f63f3740c67b1bddfc4d83e WatchSource:0}: Error finding container 5a759c1a95e42252828f3e19421e3b56cc70ebf51f63f3740c67b1bddfc4d83e: Status 404 returned error can't find the container with id 5a759c1a95e42252828f3e19421e3b56cc70ebf51f63f3740c67b1bddfc4d83e Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.350290 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sf74r"] Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.379920 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.379946 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.379956 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglw6\" (UniqueName: \"kubernetes.io/projected/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4-kube-api-access-pglw6\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.379966 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht7b6\" (UniqueName: \"kubernetes.io/projected/d7e20e92-b81b-467d-9208-e903575232da-kube-api-access-ht7b6\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.539115 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8aeb19a9-db56-488b-9410-004f24e8d11a","Type":"ContainerStarted","Data":"2d1a76a3ba2ab69e3e297610ca8c778069976d197587707445eeff616ba0fc0f"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.540080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0","Type":"ContainerStarted","Data":"5a759c1a95e42252828f3e19421e3b56cc70ebf51f63f3740c67b1bddfc4d83e"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.541104 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" event={"ID":"d7e20e92-b81b-467d-9208-e903575232da","Type":"ContainerDied","Data":"0a2a8ae8aa24b4194c5dd727abb5b04851348921587976c8d802a88df46bea2f"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.541177 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-gsg29" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.544469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7566f11c-8e52-4fb6-b1a2-98b388ffefd9","Type":"ContainerStarted","Data":"20f2a12800ca877679321089f5d22f23e78e32b0cac5eb5981dc15783c50f650"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.545863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r" event={"ID":"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d","Type":"ContainerStarted","Data":"0acb1013c846f3806fac2559e380a02955b61de13ba5a9b033823a92bcc3553f"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.546699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" event={"ID":"0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4","Type":"ContainerDied","Data":"2c52542a46521e10569583caac72cf600d7708c5dbf29a07a250356b037de7a1"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.546771 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-d24rk" Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.550909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0cb52152-8a83-4122-be94-0d2803fd5cc7","Type":"ContainerStarted","Data":"f7733a83362f6733ad231dbbc4d0aeace656f3b1c19ce9f58608933cf5f73e0b"} Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.605439 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.623856 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-gsg29"] Oct 12 20:39:31 crc kubenswrapper[4773]: W1012 20:39:31.675437 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc85890c_cde2_470e_87de_4d69f1682bd0.slice/crio-39f8963cef00bbc5762ec199f7dd580d01c10c309943d500db061e93961ac122 WatchSource:0}: Error finding container 39f8963cef00bbc5762ec199f7dd580d01c10c309943d500db061e93961ac122: Status 404 returned error can't find the container with id 39f8963cef00bbc5762ec199f7dd580d01c10c309943d500db061e93961ac122 Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.701097 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.721528 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:31 crc kubenswrapper[4773]: I1012 20:39:31.735067 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-d24rk"] Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.200866 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wfpwq"] Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.427506 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.492791 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4" path="/var/lib/kubelet/pods/0f0bc2f7-41a4-4b5e-9dcd-242ceb2826a4/volumes" Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.493383 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e20e92-b81b-467d-9208-e903575232da" path="/var/lib/kubelet/pods/d7e20e92-b81b-467d-9208-e903575232da/volumes" Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.561909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc85890c-cde2-470e-87de-4d69f1682bd0","Type":"ContainerStarted","Data":"39f8963cef00bbc5762ec199f7dd580d01c10c309943d500db061e93961ac122"} Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.568702 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wfpwq" event={"ID":"55ef70a8-016d-403f-ab02-820088160f9c","Type":"ContainerStarted","Data":"02692c7d0fe81e5f1a4aaa6462b015871074ae17eaa51fbe8f30163f6f1e1ae8"} Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.571125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerStarted","Data":"ff7ab913cf70bfc7acc2d49ac16a439d42227137b3387a23020f1913b05254dc"} Oct 12 20:39:32 crc kubenswrapper[4773]: I1012 20:39:32.573021 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerStarted","Data":"c084c720b443d756cc911fdaf91f39912d929d0f6e679aeabcc4afc8d9674e4c"} Oct 12 20:39:33 crc kubenswrapper[4773]: I1012 20:39:33.581594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"21606d72-32b0-4552-ac26-df0425f03cdf","Type":"ContainerStarted","Data":"b67d22d63f50b0658347e3eebd10fd4fbc2ef6456af65f960458ef5cdb193510"} Oct 12 20:39:35 crc kubenswrapper[4773]: I1012 20:39:35.594240 4773 generic.go:334] "Generic (PLEG): container finished" podID="0cb52152-8a83-4122-be94-0d2803fd5cc7" containerID="f7733a83362f6733ad231dbbc4d0aeace656f3b1c19ce9f58608933cf5f73e0b" exitCode=0 Oct 12 20:39:35 crc kubenswrapper[4773]: I1012 20:39:35.594644 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0cb52152-8a83-4122-be94-0d2803fd5cc7","Type":"ContainerDied","Data":"f7733a83362f6733ad231dbbc4d0aeace656f3b1c19ce9f58608933cf5f73e0b"} Oct 12 20:39:35 crc kubenswrapper[4773]: I1012 20:39:35.597119 4773 generic.go:334] "Generic (PLEG): container finished" podID="7566f11c-8e52-4fb6-b1a2-98b388ffefd9" containerID="20f2a12800ca877679321089f5d22f23e78e32b0cac5eb5981dc15783c50f650" exitCode=0 Oct 12 20:39:35 crc kubenswrapper[4773]: I1012 20:39:35.597146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7566f11c-8e52-4fb6-b1a2-98b388ffefd9","Type":"ContainerDied","Data":"20f2a12800ca877679321089f5d22f23e78e32b0cac5eb5981dc15783c50f650"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.619769 4773 generic.go:334] "Generic (PLEG): container finished" podID="55ef70a8-016d-403f-ab02-820088160f9c" containerID="9a9bcc368b1a000ef9d98f2321cc8ce2a82bf805cda884b0356fbf31837f6504" exitCode=0 Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.619913 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wfpwq" event={"ID":"55ef70a8-016d-403f-ab02-820088160f9c","Type":"ContainerDied","Data":"9a9bcc368b1a000ef9d98f2321cc8ce2a82bf805cda884b0356fbf31837f6504"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.627119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"7566f11c-8e52-4fb6-b1a2-98b388ffefd9","Type":"ContainerStarted","Data":"e4fb7c9ce35e74225c252d463b89281c458de169fafd06a87b2059e592b1a7bb"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.630531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r" event={"ID":"1a08bcbe-fa8c-43b2-a4fb-ae2212de940d","Type":"ContainerStarted","Data":"1bc8379c8d40d23a3436e53c9f1b8f42c11b6aeb8bf12bf201f1a2cf4a1352bd"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.630650 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sf74r" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.632992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc85890c-cde2-470e-87de-4d69f1682bd0","Type":"ContainerStarted","Data":"28a414afc63dc79a02b628f6dcbfbfa83af25eab3450d0bb3ef651afaf33b877"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.643770 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0cb52152-8a83-4122-be94-0d2803fd5cc7","Type":"ContainerStarted","Data":"2c6bd8b8ee22f2da8e29878003ca71e427abe075a7681172383c10619a887197"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.645370 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8aeb19a9-db56-488b-9410-004f24e8d11a","Type":"ContainerStarted","Data":"4c0b59ff0e0393ab11832e54737ddedc38b3777cb8d15c8c67769d65a2c52ffe"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.645607 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.647425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0","Type":"ContainerStarted","Data":"91b6508dc536379c0a60353a79495e377d6674d08f921506b2aaa22f3d67246c"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.647560 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.649794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"21606d72-32b0-4552-ac26-df0425f03cdf","Type":"ContainerStarted","Data":"2b7b7c45ab0206a46ae10d7fd0469bad44cdb78669bd150acf9238a1ad8f95cc"} Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.707129 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.14945222 podStartE2EDuration="30.707039727s" podCreationTimestamp="2025-10-12 20:39:07 +0000 UTC" firstStartedPulling="2025-10-12 20:39:16.114548684 +0000 UTC m=+904.350847244" lastFinishedPulling="2025-10-12 20:39:30.672136191 +0000 UTC m=+918.908434751" observedRunningTime="2025-10-12 20:39:37.671354163 +0000 UTC m=+925.907652723" watchObservedRunningTime="2025-10-12 20:39:37.707039727 +0000 UTC m=+925.943338287" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.722196 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sf74r" podStartSLOduration=17.97666759 podStartE2EDuration="22.722178964s" podCreationTimestamp="2025-10-12 20:39:15 +0000 UTC" firstStartedPulling="2025-10-12 20:39:31.35632159 +0000 UTC m=+919.592620150" lastFinishedPulling="2025-10-12 20:39:36.101832964 +0000 UTC m=+924.338131524" observedRunningTime="2025-10-12 20:39:37.69590689 +0000 UTC m=+925.932205450" watchObservedRunningTime="2025-10-12 20:39:37.722178964 +0000 UTC m=+925.958477524" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.732134 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-galera-0" podStartSLOduration=10.367245213 podStartE2EDuration="31.732115998s" podCreationTimestamp="2025-10-12 20:39:06 +0000 UTC" firstStartedPulling="2025-10-12 20:39:09.190791473 +0000 UTC m=+897.427090033" lastFinishedPulling="2025-10-12 20:39:30.555662258 +0000 UTC m=+918.791960818" observedRunningTime="2025-10-12 20:39:37.719513301 +0000 UTC m=+925.955811861" watchObservedRunningTime="2025-10-12 20:39:37.732115998 +0000 UTC m=+925.968414558" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.743104 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.972118376 podStartE2EDuration="28.743087481s" podCreationTimestamp="2025-10-12 20:39:09 +0000 UTC" firstStartedPulling="2025-10-12 20:39:31.021452825 +0000 UTC m=+919.257751385" lastFinishedPulling="2025-10-12 20:39:35.79242193 +0000 UTC m=+924.028720490" observedRunningTime="2025-10-12 20:39:37.737654261 +0000 UTC m=+925.973952821" watchObservedRunningTime="2025-10-12 20:39:37.743087481 +0000 UTC m=+925.979386041" Oct 12 20:39:37 crc kubenswrapper[4773]: I1012 20:39:37.761501 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=21.366774059 podStartE2EDuration="26.761480808s" podCreationTimestamp="2025-10-12 20:39:11 +0000 UTC" firstStartedPulling="2025-10-12 20:39:31.326097576 +0000 UTC m=+919.562396136" lastFinishedPulling="2025-10-12 20:39:36.720804325 +0000 UTC m=+924.957102885" observedRunningTime="2025-10-12 20:39:37.757215321 +0000 UTC m=+925.993513881" watchObservedRunningTime="2025-10-12 20:39:37.761480808 +0000 UTC m=+925.997779368" Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.390129 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.390407 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/openstack-galera-0" Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.661537 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wfpwq" event={"ID":"55ef70a8-016d-403f-ab02-820088160f9c","Type":"ContainerStarted","Data":"770393958ebd0ce82f286b02ff104e4a44aafdd94ff9f287a66c273ac0a9e24c"} Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.661799 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wfpwq" event={"ID":"55ef70a8-016d-403f-ab02-820088160f9c","Type":"ContainerStarted","Data":"8dbe8225bef18f27f4508e1581214008f896069ea9a6bb35f785662abc51eee9"} Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.661986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.662029 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:39:38 crc kubenswrapper[4773]: I1012 20:39:38.687067 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wfpwq" podStartSLOduration=20.047593048 podStartE2EDuration="23.687051545s" podCreationTimestamp="2025-10-12 20:39:15 +0000 UTC" firstStartedPulling="2025-10-12 20:39:32.385465035 +0000 UTC m=+920.621763595" lastFinishedPulling="2025-10-12 20:39:36.024923532 +0000 UTC m=+924.261222092" observedRunningTime="2025-10-12 20:39:38.676228367 +0000 UTC m=+926.912526927" watchObservedRunningTime="2025-10-12 20:39:38.687051545 +0000 UTC m=+926.923350105" Oct 12 20:39:39 crc kubenswrapper[4773]: I1012 20:39:39.389202 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:39 crc kubenswrapper[4773]: I1012 20:39:39.389905 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 12 
20:39:40 crc kubenswrapper[4773]: I1012 20:39:40.676597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"21606d72-32b0-4552-ac26-df0425f03cdf","Type":"ContainerStarted","Data":"8085fe2e7bb33c318fa15dfff47a1c944375b1ba56b43952d138b8943b817b00"} Oct 12 20:39:40 crc kubenswrapper[4773]: I1012 20:39:40.679664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dc85890c-cde2-470e-87de-4d69f1682bd0","Type":"ContainerStarted","Data":"4d9d7c84838d28a4b5f8905222b7f3bae1d47aa7b829e441567fc483f2d84f96"} Oct 12 20:39:40 crc kubenswrapper[4773]: I1012 20:39:40.703804 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.755513432 podStartE2EDuration="24.703780158s" podCreationTimestamp="2025-10-12 20:39:16 +0000 UTC" firstStartedPulling="2025-10-12 20:39:33.303213867 +0000 UTC m=+921.539512447" lastFinishedPulling="2025-10-12 20:39:40.251480613 +0000 UTC m=+928.487779173" observedRunningTime="2025-10-12 20:39:40.696410875 +0000 UTC m=+928.932709445" watchObservedRunningTime="2025-10-12 20:39:40.703780158 +0000 UTC m=+928.940078718" Oct 12 20:39:40 crc kubenswrapper[4773]: I1012 20:39:40.717278 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.128918929 podStartE2EDuration="23.71726245s" podCreationTimestamp="2025-10-12 20:39:17 +0000 UTC" firstStartedPulling="2025-10-12 20:39:31.679397051 +0000 UTC m=+919.915695611" lastFinishedPulling="2025-10-12 20:39:40.267740572 +0000 UTC m=+928.504039132" observedRunningTime="2025-10-12 20:39:40.713563958 +0000 UTC m=+928.949862518" watchObservedRunningTime="2025-10-12 20:39:40.71726245 +0000 UTC m=+928.953561010" Oct 12 20:39:41 crc kubenswrapper[4773]: I1012 20:39:41.468487 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" 
Oct 12 20:39:41 crc kubenswrapper[4773]: I1012 20:39:41.534148 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 12 20:39:41 crc kubenswrapper[4773]: I1012 20:39:41.804316 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:41 crc kubenswrapper[4773]: I1012 20:39:41.852950 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:42 crc kubenswrapper[4773]: I1012 20:39:42.693382 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:42 crc kubenswrapper[4773]: I1012 20:39:42.735405 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 12 20:39:42 crc kubenswrapper[4773]: I1012 20:39:42.735869 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:42 crc kubenswrapper[4773]: I1012 20:39:42.778038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.002158 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.027275 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.028786 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.038241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.061607 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.137391 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7kldg"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.138344 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.141797 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.172337 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7kldg"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.193901 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57600c2-89e9-4db4-a846-48235987e13c-config\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.194707 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4f8\" (UniqueName: \"kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.195975 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-combined-ca-bundle\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovn-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196145 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196179 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196207 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 
20:39:43.196231 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49q7s\" (UniqueName: \"kubernetes.io/projected/b57600c2-89e9-4db4-a846-48235987e13c-kube-api-access-49q7s\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.196340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovs-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297445 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovs-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297553 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57600c2-89e9-4db4-a846-48235987e13c-config\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4f8\" (UniqueName: \"kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297614 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-combined-ca-bundle\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297633 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovn-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297681 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297700 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.297808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49q7s\" (UniqueName: \"kubernetes.io/projected/b57600c2-89e9-4db4-a846-48235987e13c-kube-api-access-49q7s\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.298175 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovs-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.298255 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b57600c2-89e9-4db4-a846-48235987e13c-ovn-rundir\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.298634 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.298725 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57600c2-89e9-4db4-a846-48235987e13c-config\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.299199 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.299354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.306787 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-combined-ca-bundle\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.311214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57600c2-89e9-4db4-a846-48235987e13c-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.326076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4f8\" (UniqueName: \"kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8\") pod \"dnsmasq-dns-54c9499b4f-m8ts9\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.328956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49q7s\" (UniqueName: \"kubernetes.io/projected/b57600c2-89e9-4db4-a846-48235987e13c-kube-api-access-49q7s\") pod \"ovn-controller-metrics-7kldg\" (UID: \"b57600c2-89e9-4db4-a846-48235987e13c\") " pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.363039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.455255 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7kldg" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.468072 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.501027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmfbg\" (UniqueName: \"kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg\") pod \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.501117 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc\") pod \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.504283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c48d718-b6a7-4625-9236-9dc52f24e3a3" (UID: "9c48d718-b6a7-4625-9236-9dc52f24e3a3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.504380 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config\") pod \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\" (UID: \"9c48d718-b6a7-4625-9236-9dc52f24e3a3\") " Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.506087 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.512378 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg" (OuterVolumeSpecName: "kube-api-access-tmfbg") pod "9c48d718-b6a7-4625-9236-9dc52f24e3a3" (UID: "9c48d718-b6a7-4625-9236-9dc52f24e3a3"). InnerVolumeSpecName "kube-api-access-tmfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.522194 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config" (OuterVolumeSpecName: "config") pod "9c48d718-b6a7-4625-9236-9dc52f24e3a3" (UID: "9c48d718-b6a7-4625-9236-9dc52f24e3a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.540569 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.589909 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.591134 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.609767 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611253 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611306 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611356 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxdwc\" (UniqueName: \"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " 
pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611509 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611706 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmfbg\" (UniqueName: \"kubernetes.io/projected/9c48d718-b6a7-4625-9236-9dc52f24e3a3-kube-api-access-tmfbg\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.611737 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c48d718-b6a7-4625-9236-9dc52f24e3a3-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.711497 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.713534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-hfxh9" event={"ID":"9c48d718-b6a7-4625-9236-9dc52f24e3a3","Type":"ContainerDied","Data":"66e456125a078db4bf713629371378eac31ec50b92d24f98be92f6c6b2a93c43"} Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.713598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxdwc\" (UniqueName: \"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715159 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: 
\"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.715982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.717526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.718966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.719518 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc 
kubenswrapper[4773]: I1012 20:39:43.745152 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxdwc\" (UniqueName: \"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc\") pod \"dnsmasq-dns-bc45f6dcf-vj269\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.773863 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.792934 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.833398 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.845835 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-hfxh9"] Oct 12 20:39:43 crc kubenswrapper[4773]: I1012 20:39:43.956862 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.033757 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.035046 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.036942 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.039575 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.040041 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f5x84" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.040184 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.069760 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.115854 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.199281 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7kldg"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.229628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc\") pod \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.230120 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pwj4\" (UniqueName: \"kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4\") pod \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 
20:39:44.230211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config\") pod \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\" (UID: \"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4\") " Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.230388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.230427 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.230822 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" (UID: "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.230890 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config" (OuterVolumeSpecName: "config") pod "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" (UID: "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.231455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.231488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rtl\" (UniqueName: \"kubernetes.io/projected/4a50ca31-4c77-488f-aed7-aa99e82677f0-kube-api-access-w8rtl\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.231511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-scripts\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.231558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.231577 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-config\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.232045 4773 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.233575 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.241029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4" (OuterVolumeSpecName: "kube-api-access-5pwj4") pod "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" (UID: "37254faf-7d6e-4fe7-a4b5-4198c4cdafa4"). InnerVolumeSpecName "kube-api-access-5pwj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.334990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rtl\" (UniqueName: \"kubernetes.io/projected/4a50ca31-4c77-488f-aed7-aa99e82677f0-kube-api-access-w8rtl\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-scripts\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc 
kubenswrapper[4773]: I1012 20:39:44.335144 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-config\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.335328 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pwj4\" (UniqueName: \"kubernetes.io/projected/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4-kube-api-access-5pwj4\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.336051 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-config\") pod \"ovn-northd-0\" (UID: 
\"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.336599 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a50ca31-4c77-488f-aed7-aa99e82677f0-scripts\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.336884 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.340201 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.340463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.340951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a50ca31-4c77-488f-aed7-aa99e82677f0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.357280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rtl\" (UniqueName: 
\"kubernetes.io/projected/4a50ca31-4c77-488f-aed7-aa99e82677f0-kube-api-access-w8rtl\") pod \"ovn-northd-0\" (UID: \"4a50ca31-4c77-488f-aed7-aa99e82677f0\") " pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.361058 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.491477 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c48d718-b6a7-4625-9236-9dc52f24e3a3" path="/var/lib/kubelet/pods/9c48d718-b6a7-4625-9236-9dc52f24e3a3/volumes" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.492369 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.525449 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.571841 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.706790 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.719029 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7kldg" event={"ID":"b57600c2-89e9-4db4-a846-48235987e13c","Type":"ContainerStarted","Data":"b5b535e61233aef8fd1c14c2ea91ea6085bcaccaec2928fd90d875a37645e414"} Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.720599 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" event={"ID":"df9b89cd-a401-439b-b7a3-2b3ddc3e780f","Type":"ContainerStarted","Data":"38f3853f59831fc2bd4c170e9f444fd8880c440c951c42ecf51047e6cd1ac946"} Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.721649 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" event={"ID":"a4854fde-54e4-42d2-9d3b-fdf18455dd92","Type":"ContainerStarted","Data":"fe66513a3f3dc82f476f784262ed2545adcd800f6c4c38b70ca00b1741b3bbf2"} Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.725209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-tf5bv" event={"ID":"37254faf-7d6e-4fe7-a4b5-4198c4cdafa4","Type":"ContainerDied","Data":"129041033e247a7f108b5f44560d823744a294326e7a8cf1da62da381efdeb17"} Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.725903 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-tf5bv" Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.775550 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.792541 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-tf5bv"] Oct 12 20:39:44 crc kubenswrapper[4773]: I1012 20:39:44.854638 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.011454 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-brkxr"] Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.012764 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-brkxr" Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.024655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-brkxr"] Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.159784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktp5z\" (UniqueName: \"kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z\") pod \"glance-db-create-brkxr\" (UID: \"41844950-f353-4515-940a-61329fbb3d5f\") " pod="openstack/glance-db-create-brkxr" Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.261676 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktp5z\" (UniqueName: \"kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z\") pod \"glance-db-create-brkxr\" (UID: \"41844950-f353-4515-940a-61329fbb3d5f\") " pod="openstack/glance-db-create-brkxr" Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.285575 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktp5z\" (UniqueName: \"kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z\") pod \"glance-db-create-brkxr\" (UID: \"41844950-f353-4515-940a-61329fbb3d5f\") " pod="openstack/glance-db-create-brkxr" Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.340972 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-brkxr" Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.731911 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4a50ca31-4c77-488f-aed7-aa99e82677f0","Type":"ContainerStarted","Data":"d8268d26dc7d34a6193fbc417d99be8be1d549d6bf9d0f2df57f4f72f493d8eb"} Oct 12 20:39:45 crc kubenswrapper[4773]: I1012 20:39:45.780945 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-brkxr"] Oct 12 20:39:45 crc kubenswrapper[4773]: W1012 20:39:45.787254 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41844950_f353_4515_940a_61329fbb3d5f.slice/crio-97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7 WatchSource:0}: Error finding container 97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7: Status 404 returned error can't find the container with id 97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7 Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.492812 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37254faf-7d6e-4fe7-a4b5-4198c4cdafa4" path="/var/lib/kubelet/pods/37254faf-7d6e-4fe7-a4b5-4198c4cdafa4/volumes" Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.744659 4773 generic.go:334] "Generic (PLEG): container finished" podID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerID="70abfd27e14e742db0c164d492139a097d42d38da0b770bbae0e6750d2a66376" exitCode=0 Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.744789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" event={"ID":"df9b89cd-a401-439b-b7a3-2b3ddc3e780f","Type":"ContainerDied","Data":"70abfd27e14e742db0c164d492139a097d42d38da0b770bbae0e6750d2a66376"} Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.748060 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="41844950-f353-4515-940a-61329fbb3d5f" containerID="3a30840981f02fd0b2ecffb99b8b43305ef025d2611d0dda765881b0f1a47e8d" exitCode=0 Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.748221 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-brkxr" event={"ID":"41844950-f353-4515-940a-61329fbb3d5f","Type":"ContainerDied","Data":"3a30840981f02fd0b2ecffb99b8b43305ef025d2611d0dda765881b0f1a47e8d"} Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.748248 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-brkxr" event={"ID":"41844950-f353-4515-940a-61329fbb3d5f","Type":"ContainerStarted","Data":"97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7"} Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.750184 4773 generic.go:334] "Generic (PLEG): container finished" podID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerID="51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41" exitCode=0 Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.750217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" event={"ID":"a4854fde-54e4-42d2-9d3b-fdf18455dd92","Type":"ContainerDied","Data":"51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41"} Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.752510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7kldg" event={"ID":"b57600c2-89e9-4db4-a846-48235987e13c","Type":"ContainerStarted","Data":"fecc2eb53fb6329dfb29b462a64abd713ddbb53d682649c9b97f88ad66d9a756"} Oct 12 20:39:46 crc kubenswrapper[4773]: I1012 20:39:46.810869 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7kldg" podStartSLOduration=3.810850804 podStartE2EDuration="3.810850804s" podCreationTimestamp="2025-10-12 20:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:39:46.805100295 +0000 UTC m=+935.041398865" watchObservedRunningTime="2025-10-12 20:39:46.810850804 +0000 UTC m=+935.047149364" Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.762686 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4a50ca31-4c77-488f-aed7-aa99e82677f0","Type":"ContainerStarted","Data":"d53a8a4000067e751c12840cb8b3afc31a5a117ddea6a695405133db7c9cbbbc"} Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.763003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4a50ca31-4c77-488f-aed7-aa99e82677f0","Type":"ContainerStarted","Data":"868d0f54eedac0c6a61fe4ad4872b37c8eef1f56070e01350ec5781d4c013fac"} Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.763220 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.766417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" event={"ID":"df9b89cd-a401-439b-b7a3-2b3ddc3e780f","Type":"ContainerStarted","Data":"9da522991ffed53932b5dbe9c96edf5596e121322998240cd0ac870bf2ec730a"} Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.766963 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.771118 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" event={"ID":"a4854fde-54e4-42d2-9d3b-fdf18455dd92","Type":"ContainerStarted","Data":"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398"} Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.771156 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:47 crc kubenswrapper[4773]: 
I1012 20:39:47.787102 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.6653320310000002 podStartE2EDuration="3.787081479s" podCreationTimestamp="2025-10-12 20:39:44 +0000 UTC" firstStartedPulling="2025-10-12 20:39:44.859751572 +0000 UTC m=+933.096050132" lastFinishedPulling="2025-10-12 20:39:46.98150103 +0000 UTC m=+935.217799580" observedRunningTime="2025-10-12 20:39:47.783907471 +0000 UTC m=+936.020206041" watchObservedRunningTime="2025-10-12 20:39:47.787081479 +0000 UTC m=+936.023380049" Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.817081 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" podStartSLOduration=3.094788045 podStartE2EDuration="4.817038115s" podCreationTimestamp="2025-10-12 20:39:43 +0000 UTC" firstStartedPulling="2025-10-12 20:39:44.541929796 +0000 UTC m=+932.778228356" lastFinishedPulling="2025-10-12 20:39:46.264179866 +0000 UTC m=+934.500478426" observedRunningTime="2025-10-12 20:39:47.810232887 +0000 UTC m=+936.046531457" watchObservedRunningTime="2025-10-12 20:39:47.817038115 +0000 UTC m=+936.053336675" Oct 12 20:39:47 crc kubenswrapper[4773]: I1012 20:39:47.835693 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" podStartSLOduration=2.630559942 podStartE2EDuration="4.835669249s" podCreationTimestamp="2025-10-12 20:39:43 +0000 UTC" firstStartedPulling="2025-10-12 20:39:43.813947128 +0000 UTC m=+932.050245688" lastFinishedPulling="2025-10-12 20:39:46.019056435 +0000 UTC m=+934.255354995" observedRunningTime="2025-10-12 20:39:47.835329309 +0000 UTC m=+936.071627879" watchObservedRunningTime="2025-10-12 20:39:47.835669249 +0000 UTC m=+936.071967819" Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.094980 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-brkxr" Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.221606 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktp5z\" (UniqueName: \"kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z\") pod \"41844950-f353-4515-940a-61329fbb3d5f\" (UID: \"41844950-f353-4515-940a-61329fbb3d5f\") " Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.229231 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z" (OuterVolumeSpecName: "kube-api-access-ktp5z") pod "41844950-f353-4515-940a-61329fbb3d5f" (UID: "41844950-f353-4515-940a-61329fbb3d5f"). InnerVolumeSpecName "kube-api-access-ktp5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.323786 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktp5z\" (UniqueName: \"kubernetes.io/projected/41844950-f353-4515-940a-61329fbb3d5f-kube-api-access-ktp5z\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.794664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-brkxr" event={"ID":"41844950-f353-4515-940a-61329fbb3d5f","Type":"ContainerDied","Data":"97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7"} Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.794745 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97b4f72b2f4250b8f886449a12a6ea9cfa26891ee142c01467e553ba2b8741d7" Oct 12 20:39:48 crc kubenswrapper[4773]: I1012 20:39:48.794874 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-brkxr" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.293194 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4qddn"] Oct 12 20:39:49 crc kubenswrapper[4773]: E1012 20:39:49.293866 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41844950-f353-4515-940a-61329fbb3d5f" containerName="mariadb-database-create" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.293909 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="41844950-f353-4515-940a-61329fbb3d5f" containerName="mariadb-database-create" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.294349 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="41844950-f353-4515-940a-61329fbb3d5f" containerName="mariadb-database-create" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.295688 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.312487 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qddn"] Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.441520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzph\" (UniqueName: \"kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph\") pod \"keystone-db-create-4qddn\" (UID: \"e218f6c6-b871-4b56-94a8-64ea740b4b9f\") " pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.543692 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzph\" (UniqueName: \"kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph\") pod \"keystone-db-create-4qddn\" (UID: \"e218f6c6-b871-4b56-94a8-64ea740b4b9f\") " pod="openstack/keystone-db-create-4qddn" Oct 
12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.567285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzph\" (UniqueName: \"kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph\") pod \"keystone-db-create-4qddn\" (UID: \"e218f6c6-b871-4b56-94a8-64ea740b4b9f\") " pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.622404 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.794373 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-b96j8"] Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.796480 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b96j8" Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.816210 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b96j8"] Oct 12 20:39:49 crc kubenswrapper[4773]: I1012 20:39:49.949372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwqhl\" (UniqueName: \"kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl\") pod \"placement-db-create-b96j8\" (UID: \"1624c85d-701b-4850-a453-33f18a09e91a\") " pod="openstack/placement-db-create-b96j8" Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.050991 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwqhl\" (UniqueName: \"kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl\") pod \"placement-db-create-b96j8\" (UID: \"1624c85d-701b-4850-a453-33f18a09e91a\") " pod="openstack/placement-db-create-b96j8" Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.072573 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pwqhl\" (UniqueName: \"kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl\") pod \"placement-db-create-b96j8\" (UID: \"1624c85d-701b-4850-a453-33f18a09e91a\") " pod="openstack/placement-db-create-b96j8" Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.105554 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qddn"] Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.124775 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b96j8" Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.556610 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b96j8"] Oct 12 20:39:50 crc kubenswrapper[4773]: W1012 20:39:50.561948 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1624c85d_701b_4850_a453_33f18a09e91a.slice/crio-9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd WatchSource:0}: Error finding container 9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd: Status 404 returned error can't find the container with id 9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.823272 4773 generic.go:334] "Generic (PLEG): container finished" podID="e218f6c6-b871-4b56-94a8-64ea740b4b9f" containerID="bf8aeee3faec45c8318d83568ea26013fa7e31a4b78280cad3a8607c346c575c" exitCode=0 Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.823325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qddn" event={"ID":"e218f6c6-b871-4b56-94a8-64ea740b4b9f","Type":"ContainerDied","Data":"bf8aeee3faec45c8318d83568ea26013fa7e31a4b78280cad3a8607c346c575c"} Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.823349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-4qddn" event={"ID":"e218f6c6-b871-4b56-94a8-64ea740b4b9f","Type":"ContainerStarted","Data":"726a0cc4f3f59fa72e0666e5edaa7d1240ed24034d33ab1f1082177b568631e2"} Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.827696 4773 generic.go:334] "Generic (PLEG): container finished" podID="1624c85d-701b-4850-a453-33f18a09e91a" containerID="f928909672358bcca5c3eb29b474968a73c96b3dba1149d4ea169115ebdc7623" exitCode=0 Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.827751 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b96j8" event={"ID":"1624c85d-701b-4850-a453-33f18a09e91a","Type":"ContainerDied","Data":"f928909672358bcca5c3eb29b474968a73c96b3dba1149d4ea169115ebdc7623"} Oct 12 20:39:50 crc kubenswrapper[4773]: I1012 20:39:50.827783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b96j8" event={"ID":"1624c85d-701b-4850-a453-33f18a09e91a","Type":"ContainerStarted","Data":"9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd"} Oct 12 20:39:51 crc kubenswrapper[4773]: I1012 20:39:51.597069 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.204490 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b96j8" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.209279 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.291696 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwqhl\" (UniqueName: \"kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl\") pod \"1624c85d-701b-4850-a453-33f18a09e91a\" (UID: \"1624c85d-701b-4850-a453-33f18a09e91a\") " Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.291844 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plzph\" (UniqueName: \"kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph\") pod \"e218f6c6-b871-4b56-94a8-64ea740b4b9f\" (UID: \"e218f6c6-b871-4b56-94a8-64ea740b4b9f\") " Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.297338 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph" (OuterVolumeSpecName: "kube-api-access-plzph") pod "e218f6c6-b871-4b56-94a8-64ea740b4b9f" (UID: "e218f6c6-b871-4b56-94a8-64ea740b4b9f"). InnerVolumeSpecName "kube-api-access-plzph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.297687 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl" (OuterVolumeSpecName: "kube-api-access-pwqhl") pod "1624c85d-701b-4850-a453-33f18a09e91a" (UID: "1624c85d-701b-4850-a453-33f18a09e91a"). InnerVolumeSpecName "kube-api-access-pwqhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.394148 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwqhl\" (UniqueName: \"kubernetes.io/projected/1624c85d-701b-4850-a453-33f18a09e91a-kube-api-access-pwqhl\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.394202 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plzph\" (UniqueName: \"kubernetes.io/projected/e218f6c6-b871-4b56-94a8-64ea740b4b9f-kube-api-access-plzph\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.842578 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b96j8" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.842589 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b96j8" event={"ID":"1624c85d-701b-4850-a453-33f18a09e91a","Type":"ContainerDied","Data":"9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd"} Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.843350 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9456297b683372fc889446249a23b2bab0156e4a55b8558f33bee3fc7fc710cd" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.844233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qddn" event={"ID":"e218f6c6-b871-4b56-94a8-64ea740b4b9f","Type":"ContainerDied","Data":"726a0cc4f3f59fa72e0666e5edaa7d1240ed24034d33ab1f1082177b568631e2"} Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.844275 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726a0cc4f3f59fa72e0666e5edaa7d1240ed24034d33ab1f1082177b568631e2" Oct 12 20:39:52 crc kubenswrapper[4773]: I1012 20:39:52.844335 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qddn" Oct 12 20:39:53 crc kubenswrapper[4773]: I1012 20:39:53.365606 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:53 crc kubenswrapper[4773]: I1012 20:39:53.958890 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.040199 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.040761 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="dnsmasq-dns" containerID="cri-o://61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398" gracePeriod=10 Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.452255 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.530924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf4f8\" (UniqueName: \"kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8\") pod \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.530994 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc\") pod \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.531031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb\") pod \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.531080 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config\") pod \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\" (UID: \"a4854fde-54e4-42d2-9d3b-fdf18455dd92\") " Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.537986 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8" (OuterVolumeSpecName: "kube-api-access-vf4f8") pod "a4854fde-54e4-42d2-9d3b-fdf18455dd92" (UID: "a4854fde-54e4-42d2-9d3b-fdf18455dd92"). InnerVolumeSpecName "kube-api-access-vf4f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.575359 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4854fde-54e4-42d2-9d3b-fdf18455dd92" (UID: "a4854fde-54e4-42d2-9d3b-fdf18455dd92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.586398 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4854fde-54e4-42d2-9d3b-fdf18455dd92" (UID: "a4854fde-54e4-42d2-9d3b-fdf18455dd92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.589432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config" (OuterVolumeSpecName: "config") pod "a4854fde-54e4-42d2-9d3b-fdf18455dd92" (UID: "a4854fde-54e4-42d2-9d3b-fdf18455dd92"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.632739 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf4f8\" (UniqueName: \"kubernetes.io/projected/a4854fde-54e4-42d2-9d3b-fdf18455dd92-kube-api-access-vf4f8\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.632781 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.632793 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.632805 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4854fde-54e4-42d2-9d3b-fdf18455dd92-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.857655 4773 generic.go:334] "Generic (PLEG): container finished" podID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerID="61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398" exitCode=0 Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.857696 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" event={"ID":"a4854fde-54e4-42d2-9d3b-fdf18455dd92","Type":"ContainerDied","Data":"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398"} Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.857735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" event={"ID":"a4854fde-54e4-42d2-9d3b-fdf18455dd92","Type":"ContainerDied","Data":"fe66513a3f3dc82f476f784262ed2545adcd800f6c4c38b70ca00b1741b3bbf2"} Oct 12 
20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.857752 4773 scope.go:117] "RemoveContainer" containerID="61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.857795 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-m8ts9" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.876636 4773 scope.go:117] "RemoveContainer" containerID="51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.888446 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.894457 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-m8ts9"] Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.911937 4773 scope.go:117] "RemoveContainer" containerID="61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398" Oct 12 20:39:54 crc kubenswrapper[4773]: E1012 20:39:54.913153 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398\": container with ID starting with 61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398 not found: ID does not exist" containerID="61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.913193 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398"} err="failed to get container status \"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398\": rpc error: code = NotFound desc = could not find container \"61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398\": container with ID 
starting with 61971263d20c784ca7942660f746a76fe77935b64b392635dc6ec7bcead64398 not found: ID does not exist" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.913220 4773 scope.go:117] "RemoveContainer" containerID="51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41" Oct 12 20:39:54 crc kubenswrapper[4773]: E1012 20:39:54.913435 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41\": container with ID starting with 51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41 not found: ID does not exist" containerID="51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41" Oct 12 20:39:54 crc kubenswrapper[4773]: I1012 20:39:54.913474 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41"} err="failed to get container status \"51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41\": rpc error: code = NotFound desc = could not find container \"51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41\": container with ID starting with 51ac457b63fe5e4a9ce157f7bf306edd515ab45642a13deb4afd441970c7eb41 not found: ID does not exist" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015251 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-66fa-account-create-mvgsr"] Oct 12 20:39:55 crc kubenswrapper[4773]: E1012 20:39:55.015535 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="init" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015547 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="init" Oct 12 20:39:55 crc kubenswrapper[4773]: E1012 20:39:55.015555 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1624c85d-701b-4850-a453-33f18a09e91a" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015561 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1624c85d-701b-4850-a453-33f18a09e91a" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: E1012 20:39:55.015572 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="dnsmasq-dns" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015579 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="dnsmasq-dns" Oct 12 20:39:55 crc kubenswrapper[4773]: E1012 20:39:55.015594 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e218f6c6-b871-4b56-94a8-64ea740b4b9f" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015600 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e218f6c6-b871-4b56-94a8-64ea740b4b9f" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015897 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" containerName="dnsmasq-dns" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015911 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1624c85d-701b-4850-a453-33f18a09e91a" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.015926 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e218f6c6-b871-4b56-94a8-64ea740b4b9f" containerName="mariadb-database-create" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.016400 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.021848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.026736 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-66fa-account-create-mvgsr"] Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.139618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb2kl\" (UniqueName: \"kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl\") pod \"glance-66fa-account-create-mvgsr\" (UID: \"7ed5c0a6-a628-4c72-acf3-f61de6844a5e\") " pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.240997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb2kl\" (UniqueName: \"kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl\") pod \"glance-66fa-account-create-mvgsr\" (UID: \"7ed5c0a6-a628-4c72-acf3-f61de6844a5e\") " pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.260649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb2kl\" (UniqueName: \"kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl\") pod \"glance-66fa-account-create-mvgsr\" (UID: \"7ed5c0a6-a628-4c72-acf3-f61de6844a5e\") " pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.383812 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.777827 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-66fa-account-create-mvgsr"] Oct 12 20:39:55 crc kubenswrapper[4773]: I1012 20:39:55.865096 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-66fa-account-create-mvgsr" event={"ID":"7ed5c0a6-a628-4c72-acf3-f61de6844a5e","Type":"ContainerStarted","Data":"a73e717e442621915548f2ea7ab0685a89490ca6781a5dbec94c0f93eb3a0f0f"} Oct 12 20:39:56 crc kubenswrapper[4773]: I1012 20:39:56.495632 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4854fde-54e4-42d2-9d3b-fdf18455dd92" path="/var/lib/kubelet/pods/a4854fde-54e4-42d2-9d3b-fdf18455dd92/volumes" Oct 12 20:39:56 crc kubenswrapper[4773]: I1012 20:39:56.874585 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ed5c0a6-a628-4c72-acf3-f61de6844a5e" containerID="55e384668e25bfc563f883f9ba1e2514f006288aa70430fd36ed48d3c4d9e66d" exitCode=0 Oct 12 20:39:56 crc kubenswrapper[4773]: I1012 20:39:56.874683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-66fa-account-create-mvgsr" event={"ID":"7ed5c0a6-a628-4c72-acf3-f61de6844a5e","Type":"ContainerDied","Data":"55e384668e25bfc563f883f9ba1e2514f006288aa70430fd36ed48d3c4d9e66d"} Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.146917 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.331166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb2kl\" (UniqueName: \"kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl\") pod \"7ed5c0a6-a628-4c72-acf3-f61de6844a5e\" (UID: \"7ed5c0a6-a628-4c72-acf3-f61de6844a5e\") " Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.348896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl" (OuterVolumeSpecName: "kube-api-access-bb2kl") pod "7ed5c0a6-a628-4c72-acf3-f61de6844a5e" (UID: "7ed5c0a6-a628-4c72-acf3-f61de6844a5e"). InnerVolumeSpecName "kube-api-access-bb2kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.433164 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb2kl\" (UniqueName: \"kubernetes.io/projected/7ed5c0a6-a628-4c72-acf3-f61de6844a5e-kube-api-access-bb2kl\") on node \"crc\" DevicePath \"\"" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.669660 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.669738 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.669793 
4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.670494 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.670560 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635" gracePeriod=600 Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.899068 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-66fa-account-create-mvgsr" event={"ID":"7ed5c0a6-a628-4c72-acf3-f61de6844a5e","Type":"ContainerDied","Data":"a73e717e442621915548f2ea7ab0685a89490ca6781a5dbec94c0f93eb3a0f0f"} Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.899118 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-66fa-account-create-mvgsr" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.899124 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a73e717e442621915548f2ea7ab0685a89490ca6781a5dbec94c0f93eb3a0f0f" Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.903618 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635" exitCode=0 Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.903656 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635"} Oct 12 20:39:58 crc kubenswrapper[4773]: I1012 20:39:58.903684 4773 scope.go:117] "RemoveContainer" containerID="f52dd857ebd7841601e1ebc902a98c37025d34641286d29646b2dbc4969a08aa" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.447814 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.518215 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b8df-account-create-vsp9w"] Oct 12 20:39:59 crc kubenswrapper[4773]: E1012 20:39:59.518960 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed5c0a6-a628-4c72-acf3-f61de6844a5e" containerName="mariadb-account-create" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.519051 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed5c0a6-a628-4c72-acf3-f61de6844a5e" containerName="mariadb-account-create" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.519278 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed5c0a6-a628-4c72-acf3-f61de6844a5e" 
containerName="mariadb-account-create" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.519891 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.523903 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.524577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b8df-account-create-vsp9w"] Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.650888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzntq\" (UniqueName: \"kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq\") pod \"keystone-b8df-account-create-vsp9w\" (UID: \"0c621cba-c18f-4ffd-9685-66cc229b846e\") " pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.752537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzntq\" (UniqueName: \"kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq\") pod \"keystone-b8df-account-create-vsp9w\" (UID: \"0c621cba-c18f-4ffd-9685-66cc229b846e\") " pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.773364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzntq\" (UniqueName: \"kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq\") pod \"keystone-b8df-account-create-vsp9w\" (UID: \"0c621cba-c18f-4ffd-9685-66cc229b846e\") " pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.826583 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f309-account-create-f5mnr"] Oct 12 
20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.827500 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.829352 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.834788 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.838798 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f309-account-create-f5mnr"] Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.914436 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737"} Oct 12 20:39:59 crc kubenswrapper[4773]: I1012 20:39:59.962736 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgsn\" (UniqueName: \"kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn\") pod \"placement-f309-account-create-f5mnr\" (UID: \"01ef866a-4e30-45b3-b35f-18ebf81b265d\") " pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.064376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgsn\" (UniqueName: \"kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn\") pod \"placement-f309-account-create-f5mnr\" (UID: \"01ef866a-4e30-45b3-b35f-18ebf81b265d\") " pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.080123 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-krgsn\" (UniqueName: \"kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn\") pod \"placement-f309-account-create-f5mnr\" (UID: \"01ef866a-4e30-45b3-b35f-18ebf81b265d\") " pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.150956 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.157333 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dls7h"] Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.158313 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.160297 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.162170 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fb9l2" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.172310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dls7h"] Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.270401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.270710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzqd\" (UniqueName: \"kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd\") 
pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.270877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.270923 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.327511 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b8df-account-create-vsp9w"] Oct 12 20:40:00 crc kubenswrapper[4773]: W1012 20:40:00.342752 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c621cba_c18f_4ffd_9685_66cc229b846e.slice/crio-c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9 WatchSource:0}: Error finding container c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9: Status 404 returned error can't find the container with id c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9 Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.372994 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " 
pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.373069 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.373117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.374026 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzqd\" (UniqueName: \"kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.378888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.381184 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.381580 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.389025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzqd\" (UniqueName: \"kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd\") pod \"glance-db-sync-dls7h\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.521243 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.628796 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f309-account-create-f5mnr"] Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.929777 4773 generic.go:334] "Generic (PLEG): container finished" podID="01ef866a-4e30-45b3-b35f-18ebf81b265d" containerID="03dd167bc167fc42badc5400046408efe0f45014cb730618b7ae86e4601709df" exitCode=0 Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.930125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f309-account-create-f5mnr" event={"ID":"01ef866a-4e30-45b3-b35f-18ebf81b265d","Type":"ContainerDied","Data":"03dd167bc167fc42badc5400046408efe0f45014cb730618b7ae86e4601709df"} Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.930153 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f309-account-create-f5mnr" event={"ID":"01ef866a-4e30-45b3-b35f-18ebf81b265d","Type":"ContainerStarted","Data":"53b2c75d2ba813d868d2abe6e3508c6bcfda08d4d653205c78758c1c1ead7807"} Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.931869 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="0c621cba-c18f-4ffd-9685-66cc229b846e" containerID="e7b52457b38890c56a01887aa0151d553d4fa6dd83b807f6423c12478a644561" exitCode=0 Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.932515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b8df-account-create-vsp9w" event={"ID":"0c621cba-c18f-4ffd-9685-66cc229b846e","Type":"ContainerDied","Data":"e7b52457b38890c56a01887aa0151d553d4fa6dd83b807f6423c12478a644561"} Oct 12 20:40:00 crc kubenswrapper[4773]: I1012 20:40:00.932536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b8df-account-create-vsp9w" event={"ID":"0c621cba-c18f-4ffd-9685-66cc229b846e","Type":"ContainerStarted","Data":"c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9"} Oct 12 20:40:01 crc kubenswrapper[4773]: I1012 20:40:01.041859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dls7h"] Oct 12 20:40:01 crc kubenswrapper[4773]: I1012 20:40:01.943844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dls7h" event={"ID":"8de57e72-abb2-4344-a3e3-efa878f91a88","Type":"ContainerStarted","Data":"50546bb7b02699efe7dc6c4eeab3c9457ca46d9e5d85ccbfe68d798cb59f10fc"} Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.294620 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.306090 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.412740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzntq\" (UniqueName: \"kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq\") pod \"0c621cba-c18f-4ffd-9685-66cc229b846e\" (UID: \"0c621cba-c18f-4ffd-9685-66cc229b846e\") " Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.412993 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krgsn\" (UniqueName: \"kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn\") pod \"01ef866a-4e30-45b3-b35f-18ebf81b265d\" (UID: \"01ef866a-4e30-45b3-b35f-18ebf81b265d\") " Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.418830 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq" (OuterVolumeSpecName: "kube-api-access-kzntq") pod "0c621cba-c18f-4ffd-9685-66cc229b846e" (UID: "0c621cba-c18f-4ffd-9685-66cc229b846e"). InnerVolumeSpecName "kube-api-access-kzntq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.423938 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn" (OuterVolumeSpecName: "kube-api-access-krgsn") pod "01ef866a-4e30-45b3-b35f-18ebf81b265d" (UID: "01ef866a-4e30-45b3-b35f-18ebf81b265d"). InnerVolumeSpecName "kube-api-access-krgsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.515180 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krgsn\" (UniqueName: \"kubernetes.io/projected/01ef866a-4e30-45b3-b35f-18ebf81b265d-kube-api-access-krgsn\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.515206 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzntq\" (UniqueName: \"kubernetes.io/projected/0c621cba-c18f-4ffd-9685-66cc229b846e-kube-api-access-kzntq\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.957849 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f309-account-create-f5mnr" event={"ID":"01ef866a-4e30-45b3-b35f-18ebf81b265d","Type":"ContainerDied","Data":"53b2c75d2ba813d868d2abe6e3508c6bcfda08d4d653205c78758c1c1ead7807"} Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.957887 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53b2c75d2ba813d868d2abe6e3508c6bcfda08d4d653205c78758c1c1ead7807" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.957862 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f309-account-create-f5mnr" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.961679 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b8df-account-create-vsp9w" event={"ID":"0c621cba-c18f-4ffd-9685-66cc229b846e","Type":"ContainerDied","Data":"c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9"} Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.961704 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2808c9ed7dd114a156bd307b0b4ea6e7959d268c43a76eb379e490ab4410cb9" Oct 12 20:40:02 crc kubenswrapper[4773]: I1012 20:40:02.961747 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b8df-account-create-vsp9w" Oct 12 20:40:03 crc kubenswrapper[4773]: I1012 20:40:03.972175 4773 generic.go:334] "Generic (PLEG): container finished" podID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerID="ff7ab913cf70bfc7acc2d49ac16a439d42227137b3387a23020f1913b05254dc" exitCode=0 Oct 12 20:40:03 crc kubenswrapper[4773]: I1012 20:40:03.972223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerDied","Data":"ff7ab913cf70bfc7acc2d49ac16a439d42227137b3387a23020f1913b05254dc"} Oct 12 20:40:03 crc kubenswrapper[4773]: I1012 20:40:03.978868 4773 generic.go:334] "Generic (PLEG): container finished" podID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerID="c084c720b443d756cc911fdaf91f39912d929d0f6e679aeabcc4afc8d9674e4c" exitCode=0 Oct 12 20:40:03 crc kubenswrapper[4773]: I1012 20:40:03.978904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerDied","Data":"c084c720b443d756cc911fdaf91f39912d929d0f6e679aeabcc4afc8d9674e4c"} Oct 12 20:40:04 crc kubenswrapper[4773]: I1012 20:40:04.987941 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerStarted","Data":"33d77ea1ef4c3a341a14af56a3ff85779969d2a2e4eb8023361463bdd0c31c51"} Oct 12 20:40:04 crc kubenswrapper[4773]: I1012 20:40:04.988801 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:40:05 crc kubenswrapper[4773]: I1012 20:40:05.004227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerStarted","Data":"a6909d6d8e160f1ec003b37dba4779ea8176a13b2213803ed9a6ad4c2f6c1825"} Oct 12 20:40:05 crc kubenswrapper[4773]: I1012 20:40:05.004553 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 12 20:40:05 crc kubenswrapper[4773]: I1012 20:40:05.042429 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.334091 podStartE2EDuration="1m1.042414079s" podCreationTimestamp="2025-10-12 20:39:04 +0000 UTC" firstStartedPulling="2025-10-12 20:39:06.736937854 +0000 UTC m=+894.973236414" lastFinishedPulling="2025-10-12 20:39:30.445260933 +0000 UTC m=+918.681559493" observedRunningTime="2025-10-12 20:40:05.040778514 +0000 UTC m=+953.277077074" watchObservedRunningTime="2025-10-12 20:40:05.042414079 +0000 UTC m=+953.278712639" Oct 12 20:40:05 crc kubenswrapper[4773]: I1012 20:40:05.045829 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.261418396 podStartE2EDuration="1m0.045821703s" podCreationTimestamp="2025-10-12 20:39:05 +0000 UTC" firstStartedPulling="2025-10-12 20:39:07.760815873 +0000 UTC m=+895.997114433" lastFinishedPulling="2025-10-12 20:39:30.54521918 +0000 UTC m=+918.781517740" observedRunningTime="2025-10-12 
20:40:05.016177306 +0000 UTC m=+953.252475866" watchObservedRunningTime="2025-10-12 20:40:05.045821703 +0000 UTC m=+953.282120263" Oct 12 20:40:10 crc kubenswrapper[4773]: I1012 20:40:10.938811 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sf74r" podUID="1a08bcbe-fa8c-43b2-a4fb-ae2212de940d" containerName="ovn-controller" probeResult="failure" output=< Oct 12 20:40:10 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 12 20:40:10 crc kubenswrapper[4773]: > Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.269622 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.270278 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wfpwq" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.498952 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sf74r-config-5tmcv"] Oct 12 20:40:11 crc kubenswrapper[4773]: E1012 20:40:11.499260 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef866a-4e30-45b3-b35f-18ebf81b265d" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.499279 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef866a-4e30-45b3-b35f-18ebf81b265d" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: E1012 20:40:11.499294 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c621cba-c18f-4ffd-9685-66cc229b846e" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.499302 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c621cba-c18f-4ffd-9685-66cc229b846e" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.499484 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef866a-4e30-45b3-b35f-18ebf81b265d" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.499503 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c621cba-c18f-4ffd-9685-66cc229b846e" containerName="mariadb-account-create" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.500216 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.503104 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.516629 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sf74r-config-5tmcv"] Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.570799 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.570983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.571103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzj6c\" (UniqueName: 
\"kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.571136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.571176 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.571205 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.672962 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673061 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzj6c\" (UniqueName: \"kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673410 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673481 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673698 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.673787 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.674977 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.695823 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzj6c\" (UniqueName: 
\"kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c\") pod \"ovn-controller-sf74r-config-5tmcv\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:11 crc kubenswrapper[4773]: I1012 20:40:11.822572 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:14 crc kubenswrapper[4773]: I1012 20:40:14.644148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sf74r-config-5tmcv"] Oct 12 20:40:14 crc kubenswrapper[4773]: W1012 20:40:14.657678 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9fb50a_e942_415f_b0f8_f3ffb24d1a5c.slice/crio-6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106 WatchSource:0}: Error finding container 6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106: Status 404 returned error can't find the container with id 6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106 Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.085574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dls7h" event={"ID":"8de57e72-abb2-4344-a3e3-efa878f91a88","Type":"ContainerStarted","Data":"494e3b7a7f6f75023818be96a21e5d2a29ae1f72850ce82e85ab4c99b9a719db"} Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.088305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r-config-5tmcv" event={"ID":"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c","Type":"ContainerStarted","Data":"004096eccb5691b26964f28d37cff6f433a5fade0eb5348d8c059f600f1f818a"} Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.088360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r-config-5tmcv" 
event={"ID":"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c","Type":"ContainerStarted","Data":"6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106"} Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.115305 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dls7h" podStartSLOduration=1.9242383790000002 podStartE2EDuration="15.115283814s" podCreationTimestamp="2025-10-12 20:40:00 +0000 UTC" firstStartedPulling="2025-10-12 20:40:01.059992372 +0000 UTC m=+949.296290942" lastFinishedPulling="2025-10-12 20:40:14.251037807 +0000 UTC m=+962.487336377" observedRunningTime="2025-10-12 20:40:15.106246025 +0000 UTC m=+963.342544585" watchObservedRunningTime="2025-10-12 20:40:15.115283814 +0000 UTC m=+963.351582374" Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.127457 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sf74r-config-5tmcv" podStartSLOduration=4.127434949 podStartE2EDuration="4.127434949s" podCreationTimestamp="2025-10-12 20:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:40:15.124313013 +0000 UTC m=+963.360611593" watchObservedRunningTime="2025-10-12 20:40:15.127434949 +0000 UTC m=+963.363733509" Oct 12 20:40:15 crc kubenswrapper[4773]: I1012 20:40:15.934142 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sf74r" Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.096661 4773 generic.go:334] "Generic (PLEG): container finished" podID="8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" containerID="004096eccb5691b26964f28d37cff6f433a5fade0eb5348d8c059f600f1f818a" exitCode=0 Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.096701 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r-config-5tmcv" 
event={"ID":"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c","Type":"ContainerDied","Data":"004096eccb5691b26964f28d37cff6f433a5fade0eb5348d8c059f600f1f818a"} Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.371937 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.996106 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-mktjz"] Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.997231 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:16 crc kubenswrapper[4773]: I1012 20:40:16.998073 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.004181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mktjz"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.056263 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7gt\" (UniqueName: \"kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt\") pod \"barbican-db-create-mktjz\" (UID: \"93f29b43-b992-4bd9-8d59-11a564cace05\") " pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.074076 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-f7v8p"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.075262 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.096649 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f7v8p"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.157146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwww\" (UniqueName: \"kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww\") pod \"cinder-db-create-f7v8p\" (UID: \"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d\") " pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.157198 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc7gt\" (UniqueName: \"kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt\") pod \"barbican-db-create-mktjz\" (UID: \"93f29b43-b992-4bd9-8d59-11a564cace05\") " pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.232796 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7gt\" (UniqueName: \"kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt\") pod \"barbican-db-create-mktjz\" (UID: \"93f29b43-b992-4bd9-8d59-11a564cace05\") " pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.258229 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwww\" (UniqueName: \"kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww\") pod \"cinder-db-create-f7v8p\" (UID: \"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d\") " pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.317760 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-l87dq"] Oct 12 20:40:17 crc 
kubenswrapper[4773]: I1012 20:40:17.318983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.322316 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.329332 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.329513 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.329621 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h752b" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.329738 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.339711 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-cmc4s"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.341064 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.346934 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwww\" (UniqueName: \"kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww\") pod \"cinder-db-create-f7v8p\" (UID: \"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d\") " pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.374859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l87dq"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.388799 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.406397 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cmc4s"] Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.465887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tqg\" (UniqueName: \"kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.465939 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.466023 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.466056 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5svf\" (UniqueName: \"kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf\") pod \"neutron-db-create-cmc4s\" (UID: \"6b7606f6-4ebc-4c69-97a2-5311b014e997\") " pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.568925 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.569982 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5svf\" (UniqueName: \"kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf\") pod \"neutron-db-create-cmc4s\" (UID: \"6b7606f6-4ebc-4c69-97a2-5311b014e997\") " pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.570089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tqg\" (UniqueName: \"kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.575485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.578934 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.590268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data\") pod 
\"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.604416 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tqg\" (UniqueName: \"kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg\") pod \"keystone-db-sync-l87dq\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.603357 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5svf\" (UniqueName: \"kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf\") pod \"neutron-db-create-cmc4s\" (UID: \"6b7606f6-4ebc-4c69-97a2-5311b014e997\") " pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.687046 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781338 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzj6c\" (UniqueName: \"kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781513 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781585 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn\") pod \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\" (UID: \"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c\") " Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.781916 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.783016 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts" (OuterVolumeSpecName: "scripts") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.783530 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.783610 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run" (OuterVolumeSpecName: "var-run") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.784407 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.790583 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c" (OuterVolumeSpecName: "kube-api-access-rzj6c") pod "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" (UID: "8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c"). InnerVolumeSpecName "kube-api-access-rzj6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.792074 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.815514 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885002 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885285 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885294 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzj6c\" (UniqueName: \"kubernetes.io/projected/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-kube-api-access-rzj6c\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885306 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-run\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885314 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:17 crc kubenswrapper[4773]: I1012 20:40:17.885326 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.007440 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mktjz"] Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.114142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sf74r-config-5tmcv" 
event={"ID":"8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c","Type":"ContainerDied","Data":"6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106"} Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.114359 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2c223a3054fb76035333bb42d5cfbdb10184081b1f9bc3c5ca26616f666106" Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.114426 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sf74r-config-5tmcv" Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.126441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mktjz" event={"ID":"93f29b43-b992-4bd9-8d59-11a564cace05","Type":"ContainerStarted","Data":"19b4145ff9b32ff21715b801d6f75d8ef3c98c9e6ed0da76ee9e59fef0a4a3b1"} Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.167945 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f7v8p"] Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.387435 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l87dq"] Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.412622 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cmc4s"] Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.849222 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sf74r-config-5tmcv"] Oct 12 20:40:18 crc kubenswrapper[4773]: I1012 20:40:18.862511 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sf74r-config-5tmcv"] Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.147498 4773 generic.go:334] "Generic (PLEG): container finished" podID="6b7606f6-4ebc-4c69-97a2-5311b014e997" containerID="394cbccec20d06ae5450630aadbd87f5192dcbcc3e819130318b67258e22f447" exitCode=0 Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 
20:40:19.147599 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cmc4s" event={"ID":"6b7606f6-4ebc-4c69-97a2-5311b014e997","Type":"ContainerDied","Data":"394cbccec20d06ae5450630aadbd87f5192dcbcc3e819130318b67258e22f447"} Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.147625 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cmc4s" event={"ID":"6b7606f6-4ebc-4c69-97a2-5311b014e997","Type":"ContainerStarted","Data":"b17349238089b51d306d154f0c05989c123af7dac434a83b2fe4e57e2cdd6f5e"} Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.149429 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l87dq" event={"ID":"ee0c6496-049d-4a32-bf75-0c2279256bb8","Type":"ContainerStarted","Data":"a527d39a887086483b181d3274d4e335f69887379d5af4080f70527f074581de"} Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.155207 4773 generic.go:334] "Generic (PLEG): container finished" podID="a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" containerID="825074158d43d7622cb800e3a0f8fb953b8c13e08a88cd89e2a28316ab093879" exitCode=0 Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.155280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f7v8p" event={"ID":"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d","Type":"ContainerDied","Data":"825074158d43d7622cb800e3a0f8fb953b8c13e08a88cd89e2a28316ab093879"} Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.155304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f7v8p" event={"ID":"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d","Type":"ContainerStarted","Data":"5778e3ba22b1e28b7ba9d5f597b624da6eb369c5ccaf6caada2c527bab46208b"} Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.161014 4773 generic.go:334] "Generic (PLEG): container finished" podID="93f29b43-b992-4bd9-8d59-11a564cace05" containerID="4c4085a1ba02be8f6748a40bbfd8c6bb8a24468f58938786cdbbc9ad8ffe3c30" 
exitCode=0 Oct 12 20:40:19 crc kubenswrapper[4773]: I1012 20:40:19.161074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mktjz" event={"ID":"93f29b43-b992-4bd9-8d59-11a564cace05","Type":"ContainerDied","Data":"4c4085a1ba02be8f6748a40bbfd8c6bb8a24468f58938786cdbbc9ad8ffe3c30"} Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.502323 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" path="/var/lib/kubelet/pods/8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c/volumes" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.651232 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.679479 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwww\" (UniqueName: \"kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww\") pod \"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d\" (UID: \"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d\") " Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.722931 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww" (OuterVolumeSpecName: "kube-api-access-qlwww") pod "a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" (UID: "a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d"). InnerVolumeSpecName "kube-api-access-qlwww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.786243 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwww\" (UniqueName: \"kubernetes.io/projected/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d-kube-api-access-qlwww\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.792216 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.796680 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.888385 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5svf\" (UniqueName: \"kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf\") pod \"6b7606f6-4ebc-4c69-97a2-5311b014e997\" (UID: \"6b7606f6-4ebc-4c69-97a2-5311b014e997\") " Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.888450 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc7gt\" (UniqueName: \"kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt\") pod \"93f29b43-b992-4bd9-8d59-11a564cace05\" (UID: \"93f29b43-b992-4bd9-8d59-11a564cace05\") " Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.891560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt" (OuterVolumeSpecName: "kube-api-access-mc7gt") pod "93f29b43-b992-4bd9-8d59-11a564cace05" (UID: "93f29b43-b992-4bd9-8d59-11a564cace05"). InnerVolumeSpecName "kube-api-access-mc7gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.892265 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf" (OuterVolumeSpecName: "kube-api-access-k5svf") pod "6b7606f6-4ebc-4c69-97a2-5311b014e997" (UID: "6b7606f6-4ebc-4c69-97a2-5311b014e997"). InnerVolumeSpecName "kube-api-access-k5svf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.990871 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5svf\" (UniqueName: \"kubernetes.io/projected/6b7606f6-4ebc-4c69-97a2-5311b014e997-kube-api-access-k5svf\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:20 crc kubenswrapper[4773]: I1012 20:40:20.990906 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc7gt\" (UniqueName: \"kubernetes.io/projected/93f29b43-b992-4bd9-8d59-11a564cace05-kube-api-access-mc7gt\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.174850 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cmc4s" event={"ID":"6b7606f6-4ebc-4c69-97a2-5311b014e997","Type":"ContainerDied","Data":"b17349238089b51d306d154f0c05989c123af7dac434a83b2fe4e57e2cdd6f5e"} Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.174882 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cmc4s" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.174906 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17349238089b51d306d154f0c05989c123af7dac434a83b2fe4e57e2cdd6f5e" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.176357 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-f7v8p" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.176377 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f7v8p" event={"ID":"a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d","Type":"ContainerDied","Data":"5778e3ba22b1e28b7ba9d5f597b624da6eb369c5ccaf6caada2c527bab46208b"} Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.176415 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5778e3ba22b1e28b7ba9d5f597b624da6eb369c5ccaf6caada2c527bab46208b" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.178047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mktjz" event={"ID":"93f29b43-b992-4bd9-8d59-11a564cace05","Type":"ContainerDied","Data":"19b4145ff9b32ff21715b801d6f75d8ef3c98c9e6ed0da76ee9e59fef0a4a3b1"} Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.178084 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b4145ff9b32ff21715b801d6f75d8ef3c98c9e6ed0da76ee9e59fef0a4a3b1" Oct 12 20:40:21 crc kubenswrapper[4773]: I1012 20:40:21.178131 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mktjz" Oct 12 20:40:25 crc kubenswrapper[4773]: I1012 20:40:25.209542 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l87dq" event={"ID":"ee0c6496-049d-4a32-bf75-0c2279256bb8","Type":"ContainerStarted","Data":"f6789c82842b479a0f948ae362c648310cd41aa0ca996e30400d4a69eb2c3a18"} Oct 12 20:40:25 crc kubenswrapper[4773]: I1012 20:40:25.226902 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-l87dq" podStartSLOduration=2.247885541 podStartE2EDuration="8.226887586s" podCreationTimestamp="2025-10-12 20:40:17 +0000 UTC" firstStartedPulling="2025-10-12 20:40:18.425034498 +0000 UTC m=+966.661333048" lastFinishedPulling="2025-10-12 20:40:24.404036533 +0000 UTC m=+972.640335093" observedRunningTime="2025-10-12 20:40:25.224045518 +0000 UTC m=+973.460344078" watchObservedRunningTime="2025-10-12 20:40:25.226887586 +0000 UTC m=+973.463186146" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037308 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-93ce-account-create-2vmdf"] Oct 12 20:40:27 crc kubenswrapper[4773]: E1012 20:40:27.037698 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" containerName="ovn-config" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037713 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" containerName="ovn-config" Oct 12 20:40:27 crc kubenswrapper[4773]: E1012 20:40:27.037748 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f29b43-b992-4bd9-8d59-11a564cace05" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037756 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f29b43-b992-4bd9-8d59-11a564cace05" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: 
E1012 20:40:27.037771 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7606f6-4ebc-4c69-97a2-5311b014e997" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037779 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7606f6-4ebc-4c69-97a2-5311b014e997" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: E1012 20:40:27.037800 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037807 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037969 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9fb50a-e942-415f-b0f8-f3ffb24d1a5c" containerName="ovn-config" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037990 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f29b43-b992-4bd9-8d59-11a564cace05" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.037999 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7606f6-4ebc-4c69-97a2-5311b014e997" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.038015 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" containerName="mariadb-database-create" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.038695 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.042017 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.052431 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-93ce-account-create-2vmdf"] Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.187425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5x2\" (UniqueName: \"kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2\") pod \"barbican-93ce-account-create-2vmdf\" (UID: \"5d290bcb-24cb-4bf6-87ac-da388aca2948\") " pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.234585 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-056c-account-create-w69zk"] Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.235519 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.238665 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.248958 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-056c-account-create-w69zk"] Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.288854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5x2\" (UniqueName: \"kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2\") pod \"barbican-93ce-account-create-2vmdf\" (UID: \"5d290bcb-24cb-4bf6-87ac-da388aca2948\") " pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.307813 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5x2\" (UniqueName: \"kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2\") pod \"barbican-93ce-account-create-2vmdf\" (UID: \"5d290bcb-24cb-4bf6-87ac-da388aca2948\") " pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.356469 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-296b-account-create-rjf48"] Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.357385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.361441 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.367243 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-296b-account-create-rjf48"] Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.390682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hsd\" (UniqueName: \"kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd\") pod \"cinder-056c-account-create-w69zk\" (UID: \"97cc8725-7f9c-428d-8591-43183469d84b\") " pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.394106 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.496503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx7z2\" (UniqueName: \"kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2\") pod \"neutron-296b-account-create-rjf48\" (UID: \"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47\") " pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.496757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hsd\" (UniqueName: \"kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd\") pod \"cinder-056c-account-create-w69zk\" (UID: \"97cc8725-7f9c-428d-8591-43183469d84b\") " pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.532353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-65hsd\" (UniqueName: \"kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd\") pod \"cinder-056c-account-create-w69zk\" (UID: \"97cc8725-7f9c-428d-8591-43183469d84b\") " pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.561035 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.597864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx7z2\" (UniqueName: \"kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2\") pod \"neutron-296b-account-create-rjf48\" (UID: \"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47\") " pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.616043 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx7z2\" (UniqueName: \"kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2\") pod \"neutron-296b-account-create-rjf48\" (UID: \"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47\") " pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.678479 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.852905 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-93ce-account-create-2vmdf"] Oct 12 20:40:27 crc kubenswrapper[4773]: W1012 20:40:27.855862 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d290bcb_24cb_4bf6_87ac_da388aca2948.slice/crio-564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43 WatchSource:0}: Error finding container 564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43: Status 404 returned error can't find the container with id 564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43 Oct 12 20:40:27 crc kubenswrapper[4773]: I1012 20:40:27.987182 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-056c-account-create-w69zk"] Oct 12 20:40:27 crc kubenswrapper[4773]: W1012 20:40:27.991939 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cc8725_7f9c_428d_8591_43183469d84b.slice/crio-f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939 WatchSource:0}: Error finding container f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939: Status 404 returned error can't find the container with id f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939 Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.105690 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-296b-account-create-rjf48"] Oct 12 20:40:28 crc kubenswrapper[4773]: W1012 20:40:28.105813 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e09ae6_1c69_4aeb_8c35_1d560c9f7d47.slice/crio-2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b WatchSource:0}: Error 
finding container 2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b: Status 404 returned error can't find the container with id 2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.241889 4773 generic.go:334] "Generic (PLEG): container finished" podID="8de57e72-abb2-4344-a3e3-efa878f91a88" containerID="494e3b7a7f6f75023818be96a21e5d2a29ae1f72850ce82e85ab4c99b9a719db" exitCode=0 Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.241991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dls7h" event={"ID":"8de57e72-abb2-4344-a3e3-efa878f91a88","Type":"ContainerDied","Data":"494e3b7a7f6f75023818be96a21e5d2a29ae1f72850ce82e85ab4c99b9a719db"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.243548 4773 generic.go:334] "Generic (PLEG): container finished" podID="ee0c6496-049d-4a32-bf75-0c2279256bb8" containerID="f6789c82842b479a0f948ae362c648310cd41aa0ca996e30400d4a69eb2c3a18" exitCode=0 Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.243624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l87dq" event={"ID":"ee0c6496-049d-4a32-bf75-0c2279256bb8","Type":"ContainerDied","Data":"f6789c82842b479a0f948ae362c648310cd41aa0ca996e30400d4a69eb2c3a18"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.245064 4773 generic.go:334] "Generic (PLEG): container finished" podID="5d290bcb-24cb-4bf6-87ac-da388aca2948" containerID="df8bfcd302d8db5bb280b9d9a1d3ed49e8a052f4921c14be8b6926bc2e4c0316" exitCode=0 Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.245229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-93ce-account-create-2vmdf" event={"ID":"5d290bcb-24cb-4bf6-87ac-da388aca2948","Type":"ContainerDied","Data":"df8bfcd302d8db5bb280b9d9a1d3ed49e8a052f4921c14be8b6926bc2e4c0316"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.245847 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-93ce-account-create-2vmdf" event={"ID":"5d290bcb-24cb-4bf6-87ac-da388aca2948","Type":"ContainerStarted","Data":"564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.246669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-296b-account-create-rjf48" event={"ID":"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47","Type":"ContainerStarted","Data":"2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.248019 4773 generic.go:334] "Generic (PLEG): container finished" podID="97cc8725-7f9c-428d-8591-43183469d84b" containerID="2beb93ab4a1f4f0a90f30e313d95da7a4165f8a006e592b23e66ed91e27b7c2e" exitCode=0 Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.248131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-056c-account-create-w69zk" event={"ID":"97cc8725-7f9c-428d-8591-43183469d84b","Type":"ContainerDied","Data":"2beb93ab4a1f4f0a90f30e313d95da7a4165f8a006e592b23e66ed91e27b7c2e"} Oct 12 20:40:28 crc kubenswrapper[4773]: I1012 20:40:28.248207 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-056c-account-create-w69zk" event={"ID":"97cc8725-7f9c-428d-8591-43183469d84b","Type":"ContainerStarted","Data":"f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939"} Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.261234 4773 generic.go:334] "Generic (PLEG): container finished" podID="c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" containerID="185670ff40c99036f948026f9a33d09d50c600480f3418a5fb4a9b3a0b231e9a" exitCode=0 Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.261352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-296b-account-create-rjf48" 
event={"ID":"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47","Type":"ContainerDied","Data":"185670ff40c99036f948026f9a33d09d50c600480f3418a5fb4a9b3a0b231e9a"} Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.786542 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.795546 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.803346 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.811336 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.936880 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data\") pod \"8de57e72-abb2-4344-a3e3-efa878f91a88\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.937233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5x2\" (UniqueName: \"kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2\") pod \"5d290bcb-24cb-4bf6-87ac-da388aca2948\" (UID: \"5d290bcb-24cb-4bf6-87ac-da388aca2948\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.937390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle\") pod \"8de57e72-abb2-4344-a3e3-efa878f91a88\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " Oct 12 
20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.937463 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data\") pod \"8de57e72-abb2-4344-a3e3-efa878f91a88\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.937560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65hsd\" (UniqueName: \"kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd\") pod \"97cc8725-7f9c-428d-8591-43183469d84b\" (UID: \"97cc8725-7f9c-428d-8591-43183469d84b\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.938242 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4tqg\" (UniqueName: \"kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg\") pod \"ee0c6496-049d-4a32-bf75-0c2279256bb8\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.938347 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle\") pod \"ee0c6496-049d-4a32-bf75-0c2279256bb8\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.938416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrzqd\" (UniqueName: \"kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd\") pod \"8de57e72-abb2-4344-a3e3-efa878f91a88\" (UID: \"8de57e72-abb2-4344-a3e3-efa878f91a88\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.938558 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data\") pod \"ee0c6496-049d-4a32-bf75-0c2279256bb8\" (UID: \"ee0c6496-049d-4a32-bf75-0c2279256bb8\") " Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.943244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg" (OuterVolumeSpecName: "kube-api-access-x4tqg") pod "ee0c6496-049d-4a32-bf75-0c2279256bb8" (UID: "ee0c6496-049d-4a32-bf75-0c2279256bb8"). InnerVolumeSpecName "kube-api-access-x4tqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.943423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd" (OuterVolumeSpecName: "kube-api-access-65hsd") pod "97cc8725-7f9c-428d-8591-43183469d84b" (UID: "97cc8725-7f9c-428d-8591-43183469d84b"). InnerVolumeSpecName "kube-api-access-65hsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.943484 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2" (OuterVolumeSpecName: "kube-api-access-7t5x2") pod "5d290bcb-24cb-4bf6-87ac-da388aca2948" (UID: "5d290bcb-24cb-4bf6-87ac-da388aca2948"). InnerVolumeSpecName "kube-api-access-7t5x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.945626 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8de57e72-abb2-4344-a3e3-efa878f91a88" (UID: "8de57e72-abb2-4344-a3e3-efa878f91a88"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.946086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd" (OuterVolumeSpecName: "kube-api-access-qrzqd") pod "8de57e72-abb2-4344-a3e3-efa878f91a88" (UID: "8de57e72-abb2-4344-a3e3-efa878f91a88"). InnerVolumeSpecName "kube-api-access-qrzqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.966201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de57e72-abb2-4344-a3e3-efa878f91a88" (UID: "8de57e72-abb2-4344-a3e3-efa878f91a88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.968231 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee0c6496-049d-4a32-bf75-0c2279256bb8" (UID: "ee0c6496-049d-4a32-bf75-0c2279256bb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.979912 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data" (OuterVolumeSpecName: "config-data") pod "ee0c6496-049d-4a32-bf75-0c2279256bb8" (UID: "ee0c6496-049d-4a32-bf75-0c2279256bb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:29 crc kubenswrapper[4773]: I1012 20:40:29.980741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data" (OuterVolumeSpecName: "config-data") pod "8de57e72-abb2-4344-a3e3-efa878f91a88" (UID: "8de57e72-abb2-4344-a3e3-efa878f91a88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041658 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041732 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041754 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65hsd\" (UniqueName: \"kubernetes.io/projected/97cc8725-7f9c-428d-8591-43183469d84b-kube-api-access-65hsd\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041775 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4tqg\" (UniqueName: \"kubernetes.io/projected/ee0c6496-049d-4a32-bf75-0c2279256bb8-kube-api-access-x4tqg\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041792 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041809 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrzqd\" 
(UniqueName: \"kubernetes.io/projected/8de57e72-abb2-4344-a3e3-efa878f91a88-kube-api-access-qrzqd\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041829 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee0c6496-049d-4a32-bf75-0c2279256bb8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041847 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de57e72-abb2-4344-a3e3-efa878f91a88-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.041863 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5x2\" (UniqueName: \"kubernetes.io/projected/5d290bcb-24cb-4bf6-87ac-da388aca2948-kube-api-access-7t5x2\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.301843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-056c-account-create-w69zk" event={"ID":"97cc8725-7f9c-428d-8591-43183469d84b","Type":"ContainerDied","Data":"f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939"} Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.301903 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2890ab5fa036975dc4a42001cdfd518dbbf3cb6174cefb31f9235773ba2a939" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.301989 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-056c-account-create-w69zk" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.315522 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dls7h" event={"ID":"8de57e72-abb2-4344-a3e3-efa878f91a88","Type":"ContainerDied","Data":"50546bb7b02699efe7dc6c4eeab3c9457ca46d9e5d85ccbfe68d798cb59f10fc"} Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.315564 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50546bb7b02699efe7dc6c4eeab3c9457ca46d9e5d85ccbfe68d798cb59f10fc" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.315633 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dls7h" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.323902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l87dq" event={"ID":"ee0c6496-049d-4a32-bf75-0c2279256bb8","Type":"ContainerDied","Data":"a527d39a887086483b181d3274d4e335f69887379d5af4080f70527f074581de"} Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.323959 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a527d39a887086483b181d3274d4e335f69887379d5af4080f70527f074581de" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.323930 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l87dq" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.326338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-93ce-account-create-2vmdf" event={"ID":"5d290bcb-24cb-4bf6-87ac-da388aca2948","Type":"ContainerDied","Data":"564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43"} Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.326469 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564ed339b7f105abacc88438eab2296bd878bfa19a64d5d4b6afa87735542f43" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.326449 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-93ce-account-create-2vmdf" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.546295 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:30 crc kubenswrapper[4773]: E1012 20:40:30.546875 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d290bcb-24cb-4bf6-87ac-da388aca2948" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.546890 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d290bcb-24cb-4bf6-87ac-da388aca2948" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: E1012 20:40:30.546921 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de57e72-abb2-4344-a3e3-efa878f91a88" containerName="glance-db-sync" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.546928 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de57e72-abb2-4344-a3e3-efa878f91a88" containerName="glance-db-sync" Oct 12 20:40:30 crc kubenswrapper[4773]: E1012 20:40:30.546938 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0c6496-049d-4a32-bf75-0c2279256bb8" containerName="keystone-db-sync" Oct 12 20:40:30 crc 
kubenswrapper[4773]: I1012 20:40:30.546945 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0c6496-049d-4a32-bf75-0c2279256bb8" containerName="keystone-db-sync" Oct 12 20:40:30 crc kubenswrapper[4773]: E1012 20:40:30.546957 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cc8725-7f9c-428d-8591-43183469d84b" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.546962 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cc8725-7f9c-428d-8591-43183469d84b" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.547100 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de57e72-abb2-4344-a3e3-efa878f91a88" containerName="glance-db-sync" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.547112 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d290bcb-24cb-4bf6-87ac-da388aca2948" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.547126 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cc8725-7f9c-428d-8591-43183469d84b" containerName="mariadb-account-create" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.547135 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0c6496-049d-4a32-bf75-0c2279256bb8" containerName="keystone-db-sync" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.549729 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.557256 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.576735 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-42mn8"] Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.577764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.598314 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.598834 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h752b" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.598985 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.599112 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.616853 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42mn8"] Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.649876 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.649964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.650040 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2fb\" (UniqueName: \"kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.650087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.650106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.755481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.755530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mtpmz\" (UniqueName: \"kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759062 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759142 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2fb\" (UniqueName: \"kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759200 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759278 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759300 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.759317 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.760214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.760796 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.761640 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.762215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.825147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf2fb\" (UniqueName: \"kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb\") pod \"dnsmasq-dns-5dcb7bb4dc-pqpsd\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys\") pod 
\"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtpmz\" (UniqueName: \"kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.863248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " 
pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.869268 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.870059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.873252 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.875750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.876258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:30 crc kubenswrapper[4773]: I1012 20:40:30.927053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys\") pod \"keystone-bootstrap-42mn8\" (UID: 
\"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.001753 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.049613 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtpmz\" (UniqueName: \"kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz\") pod \"keystone-bootstrap-42mn8\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.055078 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:40:31 crc kubenswrapper[4773]: E1012 20:40:31.061036 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" containerName="mariadb-account-create" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.061062 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" containerName="mariadb-account-create" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.063302 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" containerName="mariadb-account-create" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.077345 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.079822 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.085819 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.086049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.107448 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.136115 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7d6ff65f-pcr5q"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.137508 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180330 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx7z2\" (UniqueName: \"kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2\") pod \"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47\" (UID: \"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkq8\" (UniqueName: 
\"kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180736 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180768 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.180782 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.181029 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.181095 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.191783 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7d6ff65f-pcr5q"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.204971 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2" (OuterVolumeSpecName: "kube-api-access-jx7z2") pod "c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" (UID: "c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47"). InnerVolumeSpecName "kube-api-access-jx7z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.220614 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hp22p"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.221826 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.256370 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hp22p"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.259125 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tczl6" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.259502 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.259759 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts\") pod \"ceilometer-0\" (UID: 
\"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkq8\" (UniqueName: \"kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283658 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 
20:40:31.283673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283701 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.283824 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbvwn\" 
(UniqueName: \"kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.286990 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.287120 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx7z2\" (UniqueName: \"kubernetes.io/projected/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47-kube-api-access-jx7z2\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.289233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.295918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.298059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.309376 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7d6ff65f-pcr5q"] Oct 12 20:40:31 crc kubenswrapper[4773]: E1012 20:40:31.310096 4773 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[config dns-svc kube-api-access-tbvwn ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" podUID="9ed6074d-7cb4-4818-82cc-023f4778fcc9" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.311065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.311121 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.312149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.337019 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkq8\" (UniqueName: \"kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8\") pod \"ceilometer-0\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " 
pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390627 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390654 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390697 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 
20:40:31.390733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.390778 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fw9c\" (UniqueName: \"kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.391750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.392295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.392299 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.392426 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbvwn\" (UniqueName: \"kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.392837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.392861 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.409379 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.422103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.442143 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.442158 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-296b-account-create-rjf48" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.442200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-296b-account-create-rjf48" event={"ID":"c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47","Type":"ContainerDied","Data":"2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b"} Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.442226 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2380e3cb2236012eb27d3d590a477d4eedf977f1f2feb6aa2d2fd37bbd2d053b" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.445233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbvwn\" (UniqueName: \"kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn\") pod \"dnsmasq-dns-c7d6ff65f-pcr5q\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.450897 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.493366 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.502292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.503301 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.503872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.504533 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fw9c\" (UniqueName: \"kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.498927 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.503129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.506962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.521107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.534074 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.535333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fw9c\" (UniqueName: \"kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c\") pod \"placement-db-sync-hp22p\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.586068 4773 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-hp22p" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.605760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.606250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.606419 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.606523 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4sc\" (UniqueName: \"kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.606602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb\") pod 
\"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.695595 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.707409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config\") pod \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.707679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb\") pod \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.707705 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb\") pod \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.707776 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc\") pod \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\" (UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.707839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbvwn\" (UniqueName: \"kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn\") pod \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\" 
(UID: \"9ed6074d-7cb4-4818-82cc-023f4778fcc9\") " Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.708023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.708069 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.708147 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.708184 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4sc\" (UniqueName: \"kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.708204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc 
kubenswrapper[4773]: I1012 20:40:31.709003 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.709112 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ed6074d-7cb4-4818-82cc-023f4778fcc9" (UID: "9ed6074d-7cb4-4818-82cc-023f4778fcc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.709964 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.710203 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ed6074d-7cb4-4818-82cc-023f4778fcc9" (UID: "9ed6074d-7cb4-4818-82cc-023f4778fcc9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.711256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.711875 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ed6074d-7cb4-4818-82cc-023f4778fcc9" (UID: "9ed6074d-7cb4-4818-82cc-023f4778fcc9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.711958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.715781 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config" (OuterVolumeSpecName: "config") pod "9ed6074d-7cb4-4818-82cc-023f4778fcc9" (UID: "9ed6074d-7cb4-4818-82cc-023f4778fcc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.720841 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn" (OuterVolumeSpecName: "kube-api-access-tbvwn") pod "9ed6074d-7cb4-4818-82cc-023f4778fcc9" (UID: "9ed6074d-7cb4-4818-82cc-023f4778fcc9"). InnerVolumeSpecName "kube-api-access-tbvwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.741330 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4sc\" (UniqueName: \"kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc\") pod \"dnsmasq-dns-748d7644cf-8wxm8\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: W1012 20:40:31.754918 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2df370d1_0a6c_47bf_b358_bd285d8416fc.slice/crio-4facc78482e01f29a0f85968be1d9cf6ee5e6f9d0ab826ef52e9ca23de73b137 WatchSource:0}: Error finding container 4facc78482e01f29a0f85968be1d9cf6ee5e6f9d0ab826ef52e9ca23de73b137: Status 404 returned error can't find the container with id 4facc78482e01f29a0f85968be1d9cf6ee5e6f9d0ab826ef52e9ca23de73b137 Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.764081 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.810003 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.810029 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.810038 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.810045 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed6074d-7cb4-4818-82cc-023f4778fcc9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:31 crc kubenswrapper[4773]: I1012 20:40:31.810054 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbvwn\" (UniqueName: \"kubernetes.io/projected/9ed6074d-7cb4-4818-82cc-023f4778fcc9-kube-api-access-tbvwn\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.216234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42mn8"] Oct 12 20:40:32 crc kubenswrapper[4773]: W1012 20:40:32.229349 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8fac23f_442a_42f7_8fcf_f026c88b0286.slice/crio-2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47 WatchSource:0}: Error finding container 2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47: Status 404 returned error 
can't find the container with id 2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47 Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.252706 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hp22p"] Oct 12 20:40:32 crc kubenswrapper[4773]: W1012 20:40:32.280490 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda202db6a_83d7_461f_8258_618d63c95bbf.slice/crio-25b2c5e142e13732884fa4dbe7214ed7ab022995019cbc17cc0f309ae71ccc8b WatchSource:0}: Error finding container 25b2c5e142e13732884fa4dbe7214ed7ab022995019cbc17cc0f309ae71ccc8b: Status 404 returned error can't find the container with id 25b2c5e142e13732884fa4dbe7214ed7ab022995019cbc17cc0f309ae71ccc8b Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.283288 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.303836 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.364239 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-n2ssp"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.368361 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.373688 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.373892 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w7zpb" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.382287 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2ssp"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.471022 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerStarted","Data":"25b2c5e142e13732884fa4dbe7214ed7ab022995019cbc17cc0f309ae71ccc8b"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.478120 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hp22p" event={"ID":"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a","Type":"ContainerStarted","Data":"8a433f325718e03186e096a06fd71816cc41c3223b6ace9b75ac55cae586d250"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.519101 4773 generic.go:334] "Generic (PLEG): container finished" podID="2df370d1-0a6c-47bf-b358-bd285d8416fc" containerID="974e206c400eb234689e40353dfa02e232950b7bf6798c70b92c4913aa9b3796" exitCode=0 Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.525225 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42mn8" event={"ID":"d8fac23f-442a-42f7-8fcf-f026c88b0286","Type":"ContainerStarted","Data":"2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.525258 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" 
event={"ID":"2df370d1-0a6c-47bf-b358-bd285d8416fc","Type":"ContainerDied","Data":"974e206c400eb234689e40353dfa02e232950b7bf6798c70b92c4913aa9b3796"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.525275 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qs9dw"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.525790 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7d6ff65f-pcr5q" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.525983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.526275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.526323 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmd8\" (UniqueName: \"kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.529059 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qs9dw"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.529080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" event={"ID":"2df370d1-0a6c-47bf-b358-bd285d8416fc","Type":"ContainerStarted","Data":"4facc78482e01f29a0f85968be1d9cf6ee5e6f9d0ab826ef52e9ca23de73b137"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.529097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" event={"ID":"ebb34728-7bbe-4e28-ad43-e8913ce25a30","Type":"ContainerStarted","Data":"da133027c3b478bfe010912f0dbe1743e3a66f824c547a7ace507b8ac9fb11f2"} Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.529994 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.533264 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.533402 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.533690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jvhs4" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.634398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.634453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc 
kubenswrapper[4773]: I1012 20:40:32.634487 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmd8\" (UniqueName: \"kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.634536 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.635602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.635635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpnnj\" (UniqueName: \"kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.635674 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.635705 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.635735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.642190 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.652998 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.668253 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmd8\" (UniqueName: \"kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8\") pod \"barbican-db-sync-n2ssp\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.737914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.738242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpnnj\" (UniqueName: \"kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.738299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.738321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.738371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.738441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts\") pod 
\"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.740130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.744612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.747408 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.756500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.761175 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.772106 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.779166 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpnnj\" (UniqueName: \"kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj\") pod \"cinder-db-sync-qs9dw\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.839016 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7d6ff65f-pcr5q"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.852094 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7d6ff65f-pcr5q"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.875564 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wjxlc"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.876977 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.877869 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.890523 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mzx7n" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.896329 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.896467 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.924058 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wjxlc"] Oct 12 20:40:32 crc kubenswrapper[4773]: I1012 20:40:32.983027 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.045166 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.045521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pq6t\" (UniqueName: \"kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.045542 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle\") pod 
\"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc\") pod \"2df370d1-0a6c-47bf-b358-bd285d8416fc\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb\") pod \"2df370d1-0a6c-47bf-b358-bd285d8416fc\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147445 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb\") pod \"2df370d1-0a6c-47bf-b358-bd285d8416fc\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config\") pod \"2df370d1-0a6c-47bf-b358-bd285d8416fc\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147536 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2fb\" (UniqueName: \"kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb\") pod \"2df370d1-0a6c-47bf-b358-bd285d8416fc\" (UID: \"2df370d1-0a6c-47bf-b358-bd285d8416fc\") " Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.147995 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.148617 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pq6t\" (UniqueName: \"kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.148646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.155960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb" (OuterVolumeSpecName: "kube-api-access-bf2fb") pod "2df370d1-0a6c-47bf-b358-bd285d8416fc" (UID: "2df370d1-0a6c-47bf-b358-bd285d8416fc"). InnerVolumeSpecName "kube-api-access-bf2fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.158263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.171227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.174392 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config" (OuterVolumeSpecName: "config") pod "2df370d1-0a6c-47bf-b358-bd285d8416fc" (UID: "2df370d1-0a6c-47bf-b358-bd285d8416fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.175060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pq6t\" (UniqueName: \"kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t\") pod \"neutron-db-sync-wjxlc\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.180320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2df370d1-0a6c-47bf-b358-bd285d8416fc" (UID: "2df370d1-0a6c-47bf-b358-bd285d8416fc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.192024 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2df370d1-0a6c-47bf-b358-bd285d8416fc" (UID: "2df370d1-0a6c-47bf-b358-bd285d8416fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.216140 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2df370d1-0a6c-47bf-b358-bd285d8416fc" (UID: "2df370d1-0a6c-47bf-b358-bd285d8416fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.235428 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.250389 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.250410 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.250421 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.250433 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df370d1-0a6c-47bf-b358-bd285d8416fc-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.250442 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2fb\" (UniqueName: \"kubernetes.io/projected/2df370d1-0a6c-47bf-b358-bd285d8416fc-kube-api-access-bf2fb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.555476 4773 generic.go:334] "Generic (PLEG): container finished" podID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerID="d659d639cb5fbce92d37efde980f76656a4bd79d348b129e9032d0e7ca42ba95" exitCode=0 Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.556222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" event={"ID":"ebb34728-7bbe-4e28-ad43-e8913ce25a30","Type":"ContainerDied","Data":"d659d639cb5fbce92d37efde980f76656a4bd79d348b129e9032d0e7ca42ba95"} Oct 12 20:40:34 crc kubenswrapper[4773]: 
I1012 20:40:33.558830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42mn8" event={"ID":"d8fac23f-442a-42f7-8fcf-f026c88b0286","Type":"ContainerStarted","Data":"6f9cd7387b9163346d2c04ee47ba4432a434c305d3d2db5c4a9c41f348070865"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.561467 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" event={"ID":"2df370d1-0a6c-47bf-b358-bd285d8416fc","Type":"ContainerDied","Data":"4facc78482e01f29a0f85968be1d9cf6ee5e6f9d0ab826ef52e9ca23de73b137"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.561511 4773 scope.go:117] "RemoveContainer" containerID="974e206c400eb234689e40353dfa02e232950b7bf6798c70b92c4913aa9b3796" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.561616 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.631249 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-42mn8" podStartSLOduration=3.631226462 podStartE2EDuration="3.631226462s" podCreationTimestamp="2025-10-12 20:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:40:33.59814406 +0000 UTC m=+981.834442620" watchObservedRunningTime="2025-10-12 20:40:33.631226462 +0000 UTC m=+981.867525022" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.706471 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:33.720634 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dcb7bb4dc-pqpsd"] Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.250297 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 
20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.388404 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2ssp"] Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.403464 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qs9dw"] Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.425853 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wjxlc"] Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.499159 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df370d1-0a6c-47bf-b358-bd285d8416fc" path="/var/lib/kubelet/pods/2df370d1-0a6c-47bf-b358-bd285d8416fc/volumes" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.500576 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed6074d-7cb4-4818-82cc-023f4778fcc9" path="/var/lib/kubelet/pods/9ed6074d-7cb4-4818-82cc-023f4778fcc9/volumes" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.576496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wjxlc" event={"ID":"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2","Type":"ContainerStarted","Data":"d90cb50a8b34c802d150b9e5c9662d265ce05c15647415c0e0f7712f2aa2eda2"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.579903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" event={"ID":"ebb34728-7bbe-4e28-ad43-e8913ce25a30","Type":"ContainerStarted","Data":"f132005610d867b2b66affbe3beb7c0416446da16e986370304f54da450d1bfa"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.580266 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.583268 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qs9dw" 
event={"ID":"d06c5679-8ddf-4043-b3ae-4fd8986c4483","Type":"ContainerStarted","Data":"c5c6923fdb550f2a0e4ba16055649f88fba92873460e9de50ff729c3ebfa1cbb"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.585040 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2ssp" event={"ID":"33978648-4803-4e0c-9cba-d75359e55bcd","Type":"ContainerStarted","Data":"c506764e263c29f078046874f7e3f8649e36120893b1f6e0a26ed3a7093ba613"} Oct 12 20:40:34 crc kubenswrapper[4773]: I1012 20:40:34.612815 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" podStartSLOduration=3.612793855 podStartE2EDuration="3.612793855s" podCreationTimestamp="2025-10-12 20:40:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:40:34.609957137 +0000 UTC m=+982.846255697" watchObservedRunningTime="2025-10-12 20:40:34.612793855 +0000 UTC m=+982.849092415" Oct 12 20:40:35 crc kubenswrapper[4773]: I1012 20:40:35.600670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wjxlc" event={"ID":"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2","Type":"ContainerStarted","Data":"ba61c2e034916e80392e9a0da9e27d475fc3a5c422de8875975b995f04107aa2"} Oct 12 20:40:35 crc kubenswrapper[4773]: I1012 20:40:35.627101 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wjxlc" podStartSLOduration=3.627074498 podStartE2EDuration="3.627074498s" podCreationTimestamp="2025-10-12 20:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:40:35.621372741 +0000 UTC m=+983.857671311" watchObservedRunningTime="2025-10-12 20:40:35.627074498 +0000 UTC m=+983.863373058" Oct 12 20:40:37 crc kubenswrapper[4773]: I1012 20:40:37.628190 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="d8fac23f-442a-42f7-8fcf-f026c88b0286" containerID="6f9cd7387b9163346d2c04ee47ba4432a434c305d3d2db5c4a9c41f348070865" exitCode=0 Oct 12 20:40:37 crc kubenswrapper[4773]: I1012 20:40:37.628393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42mn8" event={"ID":"d8fac23f-442a-42f7-8fcf-f026c88b0286","Type":"ContainerDied","Data":"6f9cd7387b9163346d2c04ee47ba4432a434c305d3d2db5c4a9c41f348070865"} Oct 12 20:40:41 crc kubenswrapper[4773]: I1012 20:40:41.765996 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:40:41 crc kubenswrapper[4773]: I1012 20:40:41.812347 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:40:41 crc kubenswrapper[4773]: I1012 20:40:41.812567 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" containerID="cri-o://9da522991ffed53932b5dbe9c96edf5596e121322998240cd0ac870bf2ec730a" gracePeriod=10 Oct 12 20:40:42 crc kubenswrapper[4773]: I1012 20:40:42.702724 4773 generic.go:334] "Generic (PLEG): container finished" podID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerID="9da522991ffed53932b5dbe9c96edf5596e121322998240cd0ac870bf2ec730a" exitCode=0 Oct 12 20:40:42 crc kubenswrapper[4773]: I1012 20:40:42.703046 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" event={"ID":"df9b89cd-a401-439b-b7a3-2b3ddc3e780f","Type":"ContainerDied","Data":"9da522991ffed53932b5dbe9c96edf5596e121322998240cd0ac870bf2ec730a"} Oct 12 20:40:43 crc kubenswrapper[4773]: I1012 20:40:43.957489 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.112:5353: connect: connection refused" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.250754 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381435 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381524 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381631 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381669 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtpmz\" (UniqueName: \"kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: 
\"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.381700 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data\") pod \"d8fac23f-442a-42f7-8fcf-f026c88b0286\" (UID: \"d8fac23f-442a-42f7-8fcf-f026c88b0286\") " Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.388320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.389671 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts" (OuterVolumeSpecName: "scripts") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.389927 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz" (OuterVolumeSpecName: "kube-api-access-mtpmz") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "kube-api-access-mtpmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.405917 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.417294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.449766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data" (OuterVolumeSpecName: "config-data") pod "d8fac23f-442a-42f7-8fcf-f026c88b0286" (UID: "d8fac23f-442a-42f7-8fcf-f026c88b0286"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484379 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484413 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484424 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484433 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484443 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtpmz\" (UniqueName: \"kubernetes.io/projected/d8fac23f-442a-42f7-8fcf-f026c88b0286-kube-api-access-mtpmz\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.484453 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fac23f-442a-42f7-8fcf-f026c88b0286-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.719255 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42mn8" event={"ID":"d8fac23f-442a-42f7-8fcf-f026c88b0286","Type":"ContainerDied","Data":"2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47"} Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 
20:40:44.719556 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae12f38373e8d27e841d45185833e865125fe6ad16947e0a8f3f847bb937a47" Oct 12 20:40:44 crc kubenswrapper[4773]: I1012 20:40:44.719287 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42mn8" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.331383 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-42mn8"] Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.353118 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-42mn8"] Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.423185 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lkm6g"] Oct 12 20:40:45 crc kubenswrapper[4773]: E1012 20:40:45.423537 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df370d1-0a6c-47bf-b358-bd285d8416fc" containerName="init" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.423554 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df370d1-0a6c-47bf-b358-bd285d8416fc" containerName="init" Oct 12 20:40:45 crc kubenswrapper[4773]: E1012 20:40:45.423806 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fac23f-442a-42f7-8fcf-f026c88b0286" containerName="keystone-bootstrap" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.423815 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fac23f-442a-42f7-8fcf-f026c88b0286" containerName="keystone-bootstrap" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.423970 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df370d1-0a6c-47bf-b358-bd285d8416fc" containerName="init" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.423992 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fac23f-442a-42f7-8fcf-f026c88b0286" 
containerName="keystone-bootstrap" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.424656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.429307 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lkm6g"] Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.475382 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.475490 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h752b" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.475564 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.475615 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.505831 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559jc\" (UniqueName: \"kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.505929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.505964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.505991 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.506012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.506058 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.606902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.606993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.607073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.607111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559jc\" (UniqueName: \"kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.607213 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.607255 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.610901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys\") pod \"keystone-bootstrap-lkm6g\" 
(UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.621893 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.621907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.621972 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.622494 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 20:40:45.624859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559jc\" (UniqueName: \"kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc\") pod \"keystone-bootstrap-lkm6g\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:45 crc kubenswrapper[4773]: I1012 
20:40:45.789270 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:40:46 crc kubenswrapper[4773]: I1012 20:40:46.493256 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fac23f-442a-42f7-8fcf-f026c88b0286" path="/var/lib/kubelet/pods/d8fac23f-442a-42f7-8fcf-f026c88b0286/volumes" Oct 12 20:40:48 crc kubenswrapper[4773]: I1012 20:40:48.958027 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Oct 12 20:40:53 crc kubenswrapper[4773]: I1012 20:40:53.958280 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Oct 12 20:40:53 crc kubenswrapper[4773]: I1012 20:40:53.958923 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:40:55 crc kubenswrapper[4773]: E1012 20:40:55.618034 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 12 20:40:55 crc kubenswrapper[4773]: E1012 20:40:55.618371 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mpnnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qs9dw_openstack(d06c5679-8ddf-4043-b3ae-4fd8986c4483): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 20:40:55 crc kubenswrapper[4773]: E1012 20:40:55.619523 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qs9dw" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" Oct 12 20:40:55 crc kubenswrapper[4773]: E1012 20:40:55.856149 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-qs9dw" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" Oct 12 20:40:55 crc kubenswrapper[4773]: I1012 20:40:55.991461 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.005412 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc\") pod \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.005569 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb\") pod \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.005606 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config\") pod \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.005772 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxdwc\" (UniqueName: \"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc\") pod \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.005804 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb\") pod \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\" (UID: \"df9b89cd-a401-439b-b7a3-2b3ddc3e780f\") " Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.024769 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc" (OuterVolumeSpecName: "kube-api-access-rxdwc") pod "df9b89cd-a401-439b-b7a3-2b3ddc3e780f" (UID: "df9b89cd-a401-439b-b7a3-2b3ddc3e780f"). InnerVolumeSpecName "kube-api-access-rxdwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.056629 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df9b89cd-a401-439b-b7a3-2b3ddc3e780f" (UID: "df9b89cd-a401-439b-b7a3-2b3ddc3e780f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.065043 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df9b89cd-a401-439b-b7a3-2b3ddc3e780f" (UID: "df9b89cd-a401-439b-b7a3-2b3ddc3e780f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.082550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config" (OuterVolumeSpecName: "config") pod "df9b89cd-a401-439b-b7a3-2b3ddc3e780f" (UID: "df9b89cd-a401-439b-b7a3-2b3ddc3e780f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.087363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df9b89cd-a401-439b-b7a3-2b3ddc3e780f" (UID: "df9b89cd-a401-439b-b7a3-2b3ddc3e780f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.108545 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.108573 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.108589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxdwc\" (UniqueName: \"kubernetes.io/projected/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-kube-api-access-rxdwc\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.108600 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.108610 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9b89cd-a401-439b-b7a3-2b3ddc3e780f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.165036 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lkm6g"] Oct 12 20:40:56 crc kubenswrapper[4773]: W1012 20:40:56.177572 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fc5b21_582d_4e16_849d_050425c5482a.slice/crio-5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc WatchSource:0}: Error finding container 5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc: Status 404 returned error can't find the 
container with id 5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.859683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lkm6g" event={"ID":"f0fc5b21-582d-4e16-849d-050425c5482a","Type":"ContainerStarted","Data":"dd19f0ea5fee38b8b8c44436a3586c627c7e813c276efd3de25295819ab15c3c"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.860192 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lkm6g" event={"ID":"f0fc5b21-582d-4e16-849d-050425c5482a","Type":"ContainerStarted","Data":"5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.868798 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.868856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-vj269" event={"ID":"df9b89cd-a401-439b-b7a3-2b3ddc3e780f","Type":"ContainerDied","Data":"38f3853f59831fc2bd4c170e9f444fd8880c440c951c42ecf51047e6cd1ac946"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.868952 4773 scope.go:117] "RemoveContainer" containerID="9da522991ffed53932b5dbe9c96edf5596e121322998240cd0ac870bf2ec730a" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.877536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerStarted","Data":"ba04e088640c1004893c8bd2e733d0dabff485a0fc72f3ed12edb690a7da1b8d"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.882682 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lkm6g" podStartSLOduration=11.882661158 podStartE2EDuration="11.882661158s" podCreationTimestamp="2025-10-12 20:40:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:40:56.880090777 +0000 UTC m=+1005.116389337" watchObservedRunningTime="2025-10-12 20:40:56.882661158 +0000 UTC m=+1005.118959718" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.888357 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2ssp" event={"ID":"33978648-4803-4e0c-9cba-d75359e55bcd","Type":"ContainerStarted","Data":"bad9cc052766a1613855b92aab06323d45ecae86387ebe6845468ae2d3c5d04e"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.896836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hp22p" event={"ID":"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a","Type":"ContainerStarted","Data":"80f8f1239f22eb5d2f52a46f3a64790e91aec507be8712d95563ab4bb88b8e81"} Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.902226 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.908846 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-vj269"] Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.918354 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-n2ssp" podStartSLOduration=3.7256972900000003 podStartE2EDuration="24.918333292s" podCreationTimestamp="2025-10-12 20:40:32 +0000 UTC" firstStartedPulling="2025-10-12 20:40:34.418659851 +0000 UTC m=+982.654958411" lastFinishedPulling="2025-10-12 20:40:55.611295853 +0000 UTC m=+1003.847594413" observedRunningTime="2025-10-12 20:40:56.911754431 +0000 UTC m=+1005.148052991" watchObservedRunningTime="2025-10-12 20:40:56.918333292 +0000 UTC m=+1005.154631852" Oct 12 20:40:56 crc kubenswrapper[4773]: I1012 20:40:56.931391 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-db-sync-hp22p" podStartSLOduration=2.588054331 podStartE2EDuration="25.931370612s" podCreationTimestamp="2025-10-12 20:40:31 +0000 UTC" firstStartedPulling="2025-10-12 20:40:32.263935381 +0000 UTC m=+980.500233941" lastFinishedPulling="2025-10-12 20:40:55.607251642 +0000 UTC m=+1003.843550222" observedRunningTime="2025-10-12 20:40:56.926625731 +0000 UTC m=+1005.162924291" watchObservedRunningTime="2025-10-12 20:40:56.931370612 +0000 UTC m=+1005.167669172" Oct 12 20:40:57 crc kubenswrapper[4773]: I1012 20:40:57.133516 4773 scope.go:117] "RemoveContainer" containerID="70abfd27e14e742db0c164d492139a097d42d38da0b770bbae0e6750d2a66376" Oct 12 20:40:57 crc kubenswrapper[4773]: I1012 20:40:57.905707 4773 generic.go:334] "Generic (PLEG): container finished" podID="25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" containerID="ba61c2e034916e80392e9a0da9e27d475fc3a5c422de8875975b995f04107aa2" exitCode=0 Oct 12 20:40:57 crc kubenswrapper[4773]: I1012 20:40:57.905935 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wjxlc" event={"ID":"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2","Type":"ContainerDied","Data":"ba61c2e034916e80392e9a0da9e27d475fc3a5c422de8875975b995f04107aa2"} Oct 12 20:40:57 crc kubenswrapper[4773]: I1012 20:40:57.910091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerStarted","Data":"c9d06f0b0bc33cf124ba08fb699dd167fd1a3dc2be2c329d5be0f90cc29dd0a5"} Oct 12 20:40:58 crc kubenswrapper[4773]: I1012 20:40:58.491131 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" path="/var/lib/kubelet/pods/df9b89cd-a401-439b-b7a3-2b3ddc3e780f/volumes" Oct 12 20:40:58 crc kubenswrapper[4773]: I1012 20:40:58.919637 4773 generic.go:334] "Generic (PLEG): container finished" podID="601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" 
containerID="80f8f1239f22eb5d2f52a46f3a64790e91aec507be8712d95563ab4bb88b8e81" exitCode=0 Oct 12 20:40:58 crc kubenswrapper[4773]: I1012 20:40:58.919761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hp22p" event={"ID":"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a","Type":"ContainerDied","Data":"80f8f1239f22eb5d2f52a46f3a64790e91aec507be8712d95563ab4bb88b8e81"} Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.290645 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.468700 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle\") pod \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.469232 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config\") pod \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.469328 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pq6t\" (UniqueName: \"kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t\") pod \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\" (UID: \"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2\") " Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.476713 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t" (OuterVolumeSpecName: "kube-api-access-6pq6t") pod "25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" (UID: 
"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2"). InnerVolumeSpecName "kube-api-access-6pq6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.493550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config" (OuterVolumeSpecName: "config") pod "25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" (UID: "25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.495685 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" (UID: "25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.570903 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.570929 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.570939 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pq6t\" (UniqueName: \"kubernetes.io/projected/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2-kube-api-access-6pq6t\") on node \"crc\" DevicePath \"\"" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.928013 4773 generic.go:334] "Generic (PLEG): container finished" podID="f0fc5b21-582d-4e16-849d-050425c5482a" 
containerID="dd19f0ea5fee38b8b8c44436a3586c627c7e813c276efd3de25295819ab15c3c" exitCode=0 Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.928113 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lkm6g" event={"ID":"f0fc5b21-582d-4e16-849d-050425c5482a","Type":"ContainerDied","Data":"dd19f0ea5fee38b8b8c44436a3586c627c7e813c276efd3de25295819ab15c3c"} Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.929119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wjxlc" event={"ID":"25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2","Type":"ContainerDied","Data":"d90cb50a8b34c802d150b9e5c9662d265ce05c15647415c0e0f7712f2aa2eda2"} Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.929150 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wjxlc" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.929152 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90cb50a8b34c802d150b9e5c9662d265ce05c15647415c0e0f7712f2aa2eda2" Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.930095 4773 generic.go:334] "Generic (PLEG): container finished" podID="33978648-4803-4e0c-9cba-d75359e55bcd" containerID="bad9cc052766a1613855b92aab06323d45ecae86387ebe6845468ae2d3c5d04e" exitCode=0 Oct 12 20:40:59 crc kubenswrapper[4773]: I1012 20:40:59.930167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2ssp" event={"ID":"33978648-4803-4e0c-9cba-d75359e55bcd","Type":"ContainerDied","Data":"bad9cc052766a1613855b92aab06323d45ecae86387ebe6845468ae2d3c5d04e"} Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166438 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:00 crc kubenswrapper[4773]: E1012 20:41:00.166777 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" 
containerName="neutron-db-sync" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166788 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" containerName="neutron-db-sync" Oct 12 20:41:00 crc kubenswrapper[4773]: E1012 20:41:00.166814 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="init" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166821 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="init" Oct 12 20:41:00 crc kubenswrapper[4773]: E1012 20:41:00.166829 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166834 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166985 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" containerName="neutron-db-sync" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.166996 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9b89cd-a401-439b-b7a3-2b3ddc3e780f" containerName="dnsmasq-dns" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.168033 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.198358 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.291051 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.291309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.291347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.291364 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.291382 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7ndg2\" (UniqueName: \"kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.295242 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"] Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.312351 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.315250 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.315753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mzx7n" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.320150 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.320833 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.336095 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"] Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.382505 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hp22p" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts\") pod \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs\") pod \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396412 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data\") pod \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle\") pod \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fw9c\" (UniqueName: \"kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c\") pod \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\" (UID: \"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a\") " Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndg2\" (UniqueName: \"kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.396837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6xm\" (UniqueName: \"kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.397109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs" (OuterVolumeSpecName: "logs") pod "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" (UID: "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.404591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.405783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.405970 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.406369 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.424023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts" (OuterVolumeSpecName: 
"scripts") pod "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" (UID: "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.427981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c" (OuterVolumeSpecName: "kube-api-access-9fw9c") pod "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" (UID: "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a"). InnerVolumeSpecName "kube-api-access-9fw9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.456872 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data" (OuterVolumeSpecName: "config-data") pod "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" (UID: "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.464869 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndg2\" (UniqueName: \"kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2\") pod \"dnsmasq-dns-6d7d647849-t8kdm\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.471927 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" (UID: "601a7ec1-3c5d-4f00-8417-5fb3ee299e8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.490503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.497882 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.497930 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6xm\" (UniqueName: \"kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.497991 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498121 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498135 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498144 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498153 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.498164 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fw9c\" (UniqueName: \"kubernetes.io/projected/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a-kube-api-access-9fw9c\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.502973 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.503392 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.507493 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.511655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.523063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6xm\" (UniqueName: \"kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm\") pod \"neutron-d44cb954d-ggb9c\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.696430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.946071 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hp22p" Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.946787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hp22p" event={"ID":"601a7ec1-3c5d-4f00-8417-5fb3ee299e8a","Type":"ContainerDied","Data":"8a433f325718e03186e096a06fd71816cc41c3223b6ace9b75ac55cae586d250"} Oct 12 20:41:00 crc kubenswrapper[4773]: I1012 20:41:00.946827 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a433f325718e03186e096a06fd71816cc41c3223b6ace9b75ac55cae586d250" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.003616 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.312433 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"] Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.522740 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.534227 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620226 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620261 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfmd8\" (UniqueName: \"kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8\") pod \"33978648-4803-4e0c-9cba-d75359e55bcd\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620326 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620357 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620388 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data\") pod \"33978648-4803-4e0c-9cba-d75359e55bcd\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620494 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-559jc\" (UniqueName: \"kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.620521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle\") pod \"33978648-4803-4e0c-9cba-d75359e55bcd\" (UID: \"33978648-4803-4e0c-9cba-d75359e55bcd\") " Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.631434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8" (OuterVolumeSpecName: "kube-api-access-vfmd8") pod "33978648-4803-4e0c-9cba-d75359e55bcd" (UID: "33978648-4803-4e0c-9cba-d75359e55bcd"). InnerVolumeSpecName "kube-api-access-vfmd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.634621 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.635003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc" (OuterVolumeSpecName: "kube-api-access-559jc") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "kube-api-access-559jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.643566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.644834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts" (OuterVolumeSpecName: "scripts") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.675549 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "33978648-4803-4e0c-9cba-d75359e55bcd" (UID: "33978648-4803-4e0c-9cba-d75359e55bcd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.681557 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d7c87b9bb-vwlxb"] Oct 12 20:41:01 crc kubenswrapper[4773]: E1012 20:41:01.681956 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" containerName="placement-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.681973 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" containerName="placement-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: E1012 20:41:01.682007 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fc5b21-582d-4e16-849d-050425c5482a" containerName="keystone-bootstrap" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.682013 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fc5b21-582d-4e16-849d-050425c5482a" containerName="keystone-bootstrap" Oct 12 20:41:01 crc kubenswrapper[4773]: E1012 20:41:01.682027 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33978648-4803-4e0c-9cba-d75359e55bcd" containerName="barbican-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.682032 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="33978648-4803-4e0c-9cba-d75359e55bcd" containerName="barbican-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.682173 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" containerName="placement-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.682190 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fc5b21-582d-4e16-849d-050425c5482a" containerName="keystone-bootstrap" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.682204 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="33978648-4803-4e0c-9cba-d75359e55bcd" containerName="barbican-db-sync" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.684227 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.691185 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tczl6" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.691406 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.691563 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.691703 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.692475 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.715605 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d7c87b9bb-vwlxb"] Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.717633 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data" (OuterVolumeSpecName: "config-data") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: 
"f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.721982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722055 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") pod \"f0fc5b21-582d-4e16-849d-050425c5482a\" (UID: \"f0fc5b21-582d-4e16-849d-050425c5482a\") " Oct 12 20:41:01 crc kubenswrapper[4773]: W1012 20:41:01.722142 4773 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f0fc5b21-582d-4e16-849d-050425c5482a/volumes/kubernetes.io~secret/combined-ca-bundle Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722154 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0fc5b21-582d-4e16-849d-050425c5482a" (UID: "f0fc5b21-582d-4e16-849d-050425c5482a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722379 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-combined-ca-bundle\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjf57\" (UniqueName: \"kubernetes.io/projected/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-kube-api-access-mjf57\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-logs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722489 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-public-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-scripts\") pod \"placement-d7c87b9bb-vwlxb\" (UID: 
\"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-config-data\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722605 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-internal-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722652 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722663 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722671 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722681 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-559jc\" (UniqueName: \"kubernetes.io/projected/f0fc5b21-582d-4e16-849d-050425c5482a-kube-api-access-559jc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc 
kubenswrapper[4773]: I1012 20:41:01.722692 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722702 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722710 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfmd8\" (UniqueName: \"kubernetes.io/projected/33978648-4803-4e0c-9cba-d75359e55bcd-kube-api-access-vfmd8\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.722734 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fc5b21-582d-4e16-849d-050425c5482a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.725930 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33978648-4803-4e0c-9cba-d75359e55bcd" (UID: "33978648-4803-4e0c-9cba-d75359e55bcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-internal-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-combined-ca-bundle\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjf57\" (UniqueName: \"kubernetes.io/projected/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-kube-api-access-mjf57\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-logs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823424 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-public-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc 
kubenswrapper[4773]: I1012 20:41:01.823444 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-scripts\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823472 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-config-data\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.823537 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33978648-4803-4e0c-9cba-d75359e55bcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.824623 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-logs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.827273 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-internal-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.829381 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-combined-ca-bundle\") pod 
\"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.833155 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-public-tls-certs\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.833240 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-scripts\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.833452 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-config-data\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.841004 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjf57\" (UniqueName: \"kubernetes.io/projected/c9a14159-b8fe-40c9-b7ac-6c410c02a0ab-kube-api-access-mjf57\") pod \"placement-d7c87b9bb-vwlxb\" (UID: \"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab\") " pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.954499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2ssp" event={"ID":"33978648-4803-4e0c-9cba-d75359e55bcd","Type":"ContainerDied","Data":"c506764e263c29f078046874f7e3f8649e36120893b1f6e0a26ed3a7093ba613"} Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.954535 4773 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c506764e263c29f078046874f7e3f8649e36120893b1f6e0a26ed3a7093ba613" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.954537 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2ssp" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.956923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lkm6g" event={"ID":"f0fc5b21-582d-4e16-849d-050425c5482a","Type":"ContainerDied","Data":"5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc"} Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.957014 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5256c5e4d05a91972b13edeec16c6dc98f364d669321a806e1f350195fbbe9dc" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.957121 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lkm6g" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.963345 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerStarted","Data":"961e997ad0ae2e634e480f584dc1b94bee53f1bac372a46f6096fdb85b964912"} Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.963387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerStarted","Data":"0e644269c022118a0d1ffaf53214b01fa7fec22645564c5b06c1b23783468950"} Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.963397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerStarted","Data":"902cc020dd0059f48e31bd22915e913626381fac66c2a518c3aec59e3bd859cd"} Oct 12 20:41:01 crc 
kubenswrapper[4773]: I1012 20:41:01.963488 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.965366 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerID="5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3" exitCode=0 Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.965411 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" event={"ID":"c9b92330-f8db-407e-9147-b6e1b8fc7f1c","Type":"ContainerDied","Data":"5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3"} Oct 12 20:41:01 crc kubenswrapper[4773]: I1012 20:41:01.965435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" event={"ID":"c9b92330-f8db-407e-9147-b6e1b8fc7f1c","Type":"ContainerStarted","Data":"1facda7c193cb23ccb311aa86e23f402a6117587cd676ecf57f3c722c0f44414"} Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.010260 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d44cb954d-ggb9c" podStartSLOduration=2.010233569 podStartE2EDuration="2.010233569s" podCreationTimestamp="2025-10-12 20:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:01.984684324 +0000 UTC m=+1010.220982884" watchObservedRunningTime="2025-10-12 20:41:02.010233569 +0000 UTC m=+1010.246532129" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.022013 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.088888 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d9b9d6b96-hvhdj"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.090794 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.096581 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h752b" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.096848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.097000 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.097117 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.097259 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.097394 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.115660 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d9b9d6b96-hvhdj"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.230495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-fernet-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: 
I1012 20:41:02.230912 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-internal-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.230966 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-scripts\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.231011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-combined-ca-bundle\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.231050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-config-data\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.231116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rl2m\" (UniqueName: \"kubernetes.io/projected/addfad9c-82e3-4f44-883e-c88e44a3641d-kube-api-access-5rl2m\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc 
kubenswrapper[4773]: I1012 20:41:02.231154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-credential-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.231203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-public-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.309548 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5bccc98b47-7pq24"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.320545 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.326992 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w7zpb" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.327147 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.327172 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.334427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-combined-ca-bundle\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.334470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-config-data\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.334523 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rl2m\" (UniqueName: \"kubernetes.io/projected/addfad9c-82e3-4f44-883e-c88e44a3641d-kube-api-access-5rl2m\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.334959 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-credential-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.334990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-public-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.335017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-fernet-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.335037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-internal-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.335067 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-scripts\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.347047 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-fernet-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: 
\"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.355392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-internal-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.358656 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-combined-ca-bundle\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.359701 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-public-tls-certs\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.361562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-scripts\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.363137 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-config-data\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 
20:41:02.378450 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/addfad9c-82e3-4f44-883e-c88e44a3641d-credential-keys\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.414979 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bccc98b47-7pq24"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.431317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rl2m\" (UniqueName: \"kubernetes.io/projected/addfad9c-82e3-4f44-883e-c88e44a3641d-kube-api-access-5rl2m\") pod \"keystone-6d9b9d6b96-hvhdj\" (UID: \"addfad9c-82e3-4f44-883e-c88e44a3641d\") " pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.468327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj78z\" (UniqueName: \"kubernetes.io/projected/bccbf811-29d3-4a21-856b-4ae1cfb29c74-kube-api-access-bj78z\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.468398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data-custom\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.468558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.468627 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbf811-29d3-4a21-856b-4ae1cfb29c74-logs\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.468690 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-combined-ca-bundle\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.501359 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b8cc45894-xq876"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.502653 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.505539 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.548153 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b8cc45894-xq876"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.548213 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.566156 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.567528 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.572353 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576006 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa331c5-06f0-4fac-b997-c68390b26f62-logs\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-combined-ca-bundle\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576172 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj78z\" (UniqueName: \"kubernetes.io/projected/bccbf811-29d3-4a21-856b-4ae1cfb29c74-kube-api-access-bj78z\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data-custom\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-combined-ca-bundle\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576287 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data-custom\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/1fa331c5-06f0-4fac-b997-c68390b26f62-kube-api-access-qkxbn\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: 
\"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576406 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576434 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbf811-29d3-4a21-856b-4ae1cfb29c74-logs\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.576820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbf811-29d3-4a21-856b-4ae1cfb29c74-logs\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.588800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " 
pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.596450 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj78z\" (UniqueName: \"kubernetes.io/projected/bccbf811-29d3-4a21-856b-4ae1cfb29c74-kube-api-access-bj78z\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.615295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-config-data-custom\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.647762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbf811-29d3-4a21-856b-4ae1cfb29c74-combined-ca-bundle\") pod \"barbican-worker-5bccc98b47-7pq24\" (UID: \"bccbf811-29d3-4a21-856b-4ae1cfb29c74\") " pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677456 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " 
pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677536 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5bx\" (UniqueName: \"kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677564 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa331c5-06f0-4fac-b997-c68390b26f62-logs\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: 
\"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-combined-ca-bundle\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data-custom\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.677744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/1fa331c5-06f0-4fac-b997-c68390b26f62-kube-api-access-qkxbn\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.678698 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fa331c5-06f0-4fac-b997-c68390b26f62-logs\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.702986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data-custom\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.703363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-combined-ca-bundle\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.703859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fa331c5-06f0-4fac-b997-c68390b26f62-config-data\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.706088 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.707459 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.713223 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.715535 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.728107 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.737554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/1fa331c5-06f0-4fac-b997-c68390b26f62-kube-api-access-qkxbn\") pod \"barbican-keystone-listener-b8cc45894-xq876\" (UID: \"1fa331c5-06f0-4fac-b997-c68390b26f62\") " pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.781848 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5wn\" (UniqueName: \"kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.781911 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.781960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.781998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data\") pod 
\"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782224 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782255 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: 
\"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.782284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5bx\" (UniqueName: \"kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.783629 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.784449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.785371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.785391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 
20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.793569 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bccc98b47-7pq24" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.804747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5bx\" (UniqueName: \"kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx\") pod \"dnsmasq-dns-7ff5bdc4b9-kjn4w\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") " pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.849622 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d7c87b9bb-vwlxb"] Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.883208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5wn\" (UniqueName: \"kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.883258 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.883284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.883344 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.883404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.884616 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.895137 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.895678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.895686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.915693 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5wn\" (UniqueName: \"kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn\") pod \"barbican-api-7b7484c884-9t4p6\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.934120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" Oct 12 20:41:02 crc kubenswrapper[4773]: I1012 20:41:02.988157 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.001533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" event={"ID":"c9b92330-f8db-407e-9147-b6e1b8fc7f1c","Type":"ContainerStarted","Data":"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1"} Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.001612 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="dnsmasq-dns" containerID="cri-o://36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1" gracePeriod=10 Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.001692 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.021249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-d7c87b9bb-vwlxb" event={"ID":"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab","Type":"ContainerStarted","Data":"488fb07579572a749edb1482aee7d0b3f1a516cfa3b7095a23e4d203ed0ad5f3"} Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.033687 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" podStartSLOduration=3.033668246 podStartE2EDuration="3.033668246s" podCreationTimestamp="2025-10-12 20:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:03.033334106 +0000 UTC m=+1011.269632666" watchObservedRunningTime="2025-10-12 20:41:03.033668246 +0000 UTC m=+1011.269966806" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.048843 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.341171 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7857d9f9fc-69hqj"] Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.344152 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.360424 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7857d9f9fc-69hqj"] Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.363708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.363881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-ovndb-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-config\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409335 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-combined-ca-bundle\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-public-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-httpd-config\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-internal-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.409501 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx7j\" (UniqueName: \"kubernetes.io/projected/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-kube-api-access-hlx7j\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.461813 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d9b9d6b96-hvhdj"] Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.471483 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bccc98b47-7pq24"] Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512127 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-httpd-config\") pod 
\"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-internal-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx7j\" (UniqueName: \"kubernetes.io/projected/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-kube-api-access-hlx7j\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512930 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-ovndb-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512957 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-config\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.512998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-combined-ca-bundle\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " 
pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.513024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-public-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.518960 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-config\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.518980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-ovndb-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.519749 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-httpd-config\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.521266 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-internal-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.525876 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-public-tls-certs\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.526477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-combined-ca-bundle\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.534915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx7j\" (UniqueName: \"kubernetes.io/projected/86be19ef-4d97-4e17-bfbc-3c9c8153cd76-kube-api-access-hlx7j\") pod \"neutron-7857d9f9fc-69hqj\" (UID: \"86be19ef-4d97-4e17-bfbc-3c9c8153cd76\") " pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.623015 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b8cc45894-xq876"] Oct 12 20:41:03 crc kubenswrapper[4773]: W1012 20:41:03.639533 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa331c5_06f0_4fac_b997_c68390b26f62.slice/crio-8797ba4ad8b7cd918aecc5f4a595bab02b3332a8221a33a23721478651f7d368 WatchSource:0}: Error finding container 8797ba4ad8b7cd918aecc5f4a595bab02b3332a8221a33a23721478651f7d368: Status 404 returned error can't find the container with id 8797ba4ad8b7cd918aecc5f4a595bab02b3332a8221a33a23721478651f7d368 Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.714416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"] Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.831989 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.832242 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.921927 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb\") pod \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.921984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config\") pod \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.922099 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb\") pod \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.922123 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc\") pod \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.922148 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ndg2\" (UniqueName: \"kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2\") pod 
\"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\" (UID: \"c9b92330-f8db-407e-9147-b6e1b8fc7f1c\") " Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.927902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2" (OuterVolumeSpecName: "kube-api-access-7ndg2") pod "c9b92330-f8db-407e-9147-b6e1b8fc7f1c" (UID: "c9b92330-f8db-407e-9147-b6e1b8fc7f1c"). InnerVolumeSpecName "kube-api-access-7ndg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:03 crc kubenswrapper[4773]: I1012 20:41:03.991655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:03.998324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c9b92330-f8db-407e-9147-b6e1b8fc7f1c" (UID: "c9b92330-f8db-407e-9147-b6e1b8fc7f1c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.023622 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ndg2\" (UniqueName: \"kubernetes.io/projected/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-kube-api-access-7ndg2\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.023651 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.042790 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c9b92330-f8db-407e-9147-b6e1b8fc7f1c" (UID: "c9b92330-f8db-407e-9147-b6e1b8fc7f1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.053497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d7c87b9bb-vwlxb" event={"ID":"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab","Type":"ContainerStarted","Data":"ab833dd191d18a9b298c1aa60b42b3338ee35a65b070923db25200a70186eabf"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.053534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d7c87b9bb-vwlxb" event={"ID":"c9a14159-b8fe-40c9-b7ac-6c410c02a0ab","Type":"ContainerStarted","Data":"5cefd9a45516169bdc8f36110637e8e51cf0cbdabe65988ab2d3723b8a4e7197"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.054789 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.054811 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-d7c87b9bb-vwlxb" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.075067 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c9b92330-f8db-407e-9147-b6e1b8fc7f1c" (UID: "c9b92330-f8db-407e-9147-b6e1b8fc7f1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.077662 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" event={"ID":"f96ff898-cc06-439e-8dcb-880c1c0a462a","Type":"ContainerStarted","Data":"d39f7db3356048bbd23f89d41bb6045de2507e2c9ebbdc23f5ba764a6840392b"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.087533 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d7c87b9bb-vwlxb" podStartSLOduration=3.087510761 podStartE2EDuration="3.087510761s" podCreationTimestamp="2025-10-12 20:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:04.078227985 +0000 UTC m=+1012.314526545" watchObservedRunningTime="2025-10-12 20:41:04.087510761 +0000 UTC m=+1012.323809331" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.090295 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d9b9d6b96-hvhdj" event={"ID":"addfad9c-82e3-4f44-883e-c88e44a3641d","Type":"ContainerStarted","Data":"f90d2b46b406420924d7cec0dfd826a0a34ec1da8915328b5e81975be9a4fbf9"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.090340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d9b9d6b96-hvhdj" event={"ID":"addfad9c-82e3-4f44-883e-c88e44a3641d","Type":"ContainerStarted","Data":"828efe88a9ddb4a51554362a36d0ef366a3746e0132b7b066973138f3ead1200"} Oct 12 20:41:04 crc 
kubenswrapper[4773]: I1012 20:41:04.091254 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d9b9d6b96-hvhdj" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.096815 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config" (OuterVolumeSpecName: "config") pod "c9b92330-f8db-407e-9147-b6e1b8fc7f1c" (UID: "c9b92330-f8db-407e-9147-b6e1b8fc7f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.105227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" event={"ID":"1fa331c5-06f0-4fac-b997-c68390b26f62","Type":"ContainerStarted","Data":"8797ba4ad8b7cd918aecc5f4a595bab02b3332a8221a33a23721478651f7d368"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.117048 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bccc98b47-7pq24" event={"ID":"bccbf811-29d3-4a21-856b-4ae1cfb29c74","Type":"ContainerStarted","Data":"3cbc84cab042ae1f5609ec2995d5f4e0d3fa721e0c202386e99d41f15a2ae320"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.126054 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.126074 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.126085 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c9b92330-f8db-407e-9147-b6e1b8fc7f1c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 
12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.132214 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d9b9d6b96-hvhdj" podStartSLOduration=2.132194534 podStartE2EDuration="2.132194534s" podCreationTimestamp="2025-10-12 20:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:04.116008427 +0000 UTC m=+1012.352306987" watchObservedRunningTime="2025-10-12 20:41:04.132194534 +0000 UTC m=+1012.368493094" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.142563 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerID="36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1" exitCode=0 Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.142622 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" event={"ID":"c9b92330-f8db-407e-9147-b6e1b8fc7f1c","Type":"ContainerDied","Data":"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.142646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" event={"ID":"c9b92330-f8db-407e-9147-b6e1b8fc7f1c","Type":"ContainerDied","Data":"1facda7c193cb23ccb311aa86e23f402a6117587cd676ecf57f3c722c0f44414"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.142667 4773 scope.go:117] "RemoveContainer" containerID="36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.142818 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d7d647849-t8kdm" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.152982 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerStarted","Data":"be2026507717aa53ba7e68be036a888fd6c862398f7e9fbf63767eb9c0e64a4b"} Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.211070 4773 scope.go:117] "RemoveContainer" containerID="5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.238156 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.242590 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d7d647849-t8kdm"] Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.262952 4773 scope.go:117] "RemoveContainer" containerID="36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1" Oct 12 20:41:04 crc kubenswrapper[4773]: E1012 20:41:04.267212 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1\": container with ID starting with 36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1 not found: ID does not exist" containerID="36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.267246 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1"} err="failed to get container status \"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1\": rpc error: code = NotFound desc = could not find container 
\"36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1\": container with ID starting with 36ae2d96fc67ca6a4c1bdf5f3cf1e92c25fa7dea2436bf2d91ea9442e2477bf1 not found: ID does not exist" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.267266 4773 scope.go:117] "RemoveContainer" containerID="5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3" Oct 12 20:41:04 crc kubenswrapper[4773]: E1012 20:41:04.269867 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3\": container with ID starting with 5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3 not found: ID does not exist" containerID="5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.269932 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3"} err="failed to get container status \"5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3\": rpc error: code = NotFound desc = could not find container \"5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3\": container with ID starting with 5c658b63a85efc622c95b65fa6749243a061473b4f27dc92a8a7a9c33f6a0ec3 not found: ID does not exist" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.497709 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" path="/var/lib/kubelet/pods/c9b92330-f8db-407e-9147-b6e1b8fc7f1c/volumes" Oct 12 20:41:04 crc kubenswrapper[4773]: I1012 20:41:04.573416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7857d9f9fc-69hqj"] Oct 12 20:41:04 crc kubenswrapper[4773]: W1012 20:41:04.592007 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86be19ef_4d97_4e17_bfbc_3c9c8153cd76.slice/crio-bca7c4248c1a5ee95a5e6102917dfb4207e055e0958faeb3ab6f9cfa9242529d WatchSource:0}: Error finding container bca7c4248c1a5ee95a5e6102917dfb4207e055e0958faeb3ab6f9cfa9242529d: Status 404 returned error can't find the container with id bca7c4248c1a5ee95a5e6102917dfb4207e055e0958faeb3ab6f9cfa9242529d Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.165401 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerStarted","Data":"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.165654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerStarted","Data":"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.165850 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.165878 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.170542 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7857d9f9fc-69hqj" event={"ID":"86be19ef-4d97-4e17-bfbc-3c9c8153cd76","Type":"ContainerStarted","Data":"48669c766dafb42b3e95160354c7f6058d28cb475fb3687fc7bf1623d83d2d82"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.170572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7857d9f9fc-69hqj" 
event={"ID":"86be19ef-4d97-4e17-bfbc-3c9c8153cd76","Type":"ContainerStarted","Data":"b1ea7c2702411f9ca65e48e6af784717255a84506afa32daf7bfa3b964ac024d"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.170580 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7857d9f9fc-69hqj" event={"ID":"86be19ef-4d97-4e17-bfbc-3c9c8153cd76","Type":"ContainerStarted","Data":"bca7c4248c1a5ee95a5e6102917dfb4207e055e0958faeb3ab6f9cfa9242529d"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.171300 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7857d9f9fc-69hqj" Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.174572 4773 generic.go:334] "Generic (PLEG): container finished" podID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerID="db8a5f1dd5b1d540442da36b1f5cf342c6755e3c053ceada8a512e8103bc17c4" exitCode=0 Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.174616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" event={"ID":"f96ff898-cc06-439e-8dcb-880c1c0a462a","Type":"ContainerDied","Data":"db8a5f1dd5b1d540442da36b1f5cf342c6755e3c053ceada8a512e8103bc17c4"} Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.183260 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b7484c884-9t4p6" podStartSLOduration=3.183243951 podStartE2EDuration="3.183243951s" podCreationTimestamp="2025-10-12 20:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:05.180212638 +0000 UTC m=+1013.416511198" watchObservedRunningTime="2025-10-12 20:41:05.183243951 +0000 UTC m=+1013.419542511" Oct 12 20:41:05 crc kubenswrapper[4773]: I1012 20:41:05.208110 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7857d9f9fc-69hqj" podStartSLOduration=2.208092537 
podStartE2EDuration="2.208092537s" podCreationTimestamp="2025-10-12 20:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:05.205531126 +0000 UTC m=+1013.441829686" watchObservedRunningTime="2025-10-12 20:41:05.208092537 +0000 UTC m=+1013.444391097" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.537737 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b76446c56-rrfn7"] Oct 12 20:41:06 crc kubenswrapper[4773]: E1012 20:41:06.539369 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="dnsmasq-dns" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.539455 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="dnsmasq-dns" Oct 12 20:41:06 crc kubenswrapper[4773]: E1012 20:41:06.539518 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="init" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.539569 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="init" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.539800 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b92330-f8db-407e-9147-b6e1b8fc7f1c" containerName="dnsmasq-dns" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.540666 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.548137 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.548165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.576136 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b76446c56-rrfn7"] Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688131 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hswj\" (UniqueName: \"kubernetes.io/projected/44baa955-b25d-4648-aef5-423ad5992301-kube-api-access-2hswj\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688195 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-public-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688234 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688333 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-combined-ca-bundle\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688350 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44baa955-b25d-4648-aef5-423ad5992301-logs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688369 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data-custom\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.688422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-internal-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-internal-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789401 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2hswj\" (UniqueName: \"kubernetes.io/projected/44baa955-b25d-4648-aef5-423ad5992301-kube-api-access-2hswj\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789426 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-public-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-combined-ca-bundle\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789528 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44baa955-b25d-4648-aef5-423ad5992301-logs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.789550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data-custom\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.794440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data-custom\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.794677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44baa955-b25d-4648-aef5-423ad5992301-logs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.798653 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-internal-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.798854 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-config-data\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.799659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-combined-ca-bundle\") pod 
\"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.801416 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44baa955-b25d-4648-aef5-423ad5992301-public-tls-certs\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.816353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hswj\" (UniqueName: \"kubernetes.io/projected/44baa955-b25d-4648-aef5-423ad5992301-kube-api-access-2hswj\") pod \"barbican-api-7b76446c56-rrfn7\" (UID: \"44baa955-b25d-4648-aef5-423ad5992301\") " pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:06 crc kubenswrapper[4773]: I1012 20:41:06.860424 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.216260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" event={"ID":"f96ff898-cc06-439e-8dcb-880c1c0a462a","Type":"ContainerStarted","Data":"cd9d5ac415de44b249204ce75a3a1a50d825cef2dc16b789ed54fbbd6574956d"} Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.217834 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.223668 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" event={"ID":"1fa331c5-06f0-4fac-b997-c68390b26f62","Type":"ContainerStarted","Data":"6b5e94c7790b2779e245d5f09d12a302f3165b6408b14bf767e34cba7cad6972"} Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.231882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bccc98b47-7pq24" event={"ID":"bccbf811-29d3-4a21-856b-4ae1cfb29c74","Type":"ContainerStarted","Data":"338e9ea35a3b117ea130b974a3b1e07edbcc148142fc4a012c839e7f51d4abbb"} Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.253292 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" podStartSLOduration=5.253272034 podStartE2EDuration="5.253272034s" podCreationTimestamp="2025-10-12 20:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:07.239867375 +0000 UTC m=+1015.476165935" watchObservedRunningTime="2025-10-12 20:41:07.253272034 +0000 UTC m=+1015.489570594" Oct 12 20:41:07 crc kubenswrapper[4773]: I1012 20:41:07.375438 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b76446c56-rrfn7"] Oct 12 20:41:07 crc kubenswrapper[4773]: W1012 
20:41:07.381406 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44baa955_b25d_4648_aef5_423ad5992301.slice/crio-0b57910f1dc4337a60954f918adc384469663b6ec0a703e258cad303751841e8 WatchSource:0}: Error finding container 0b57910f1dc4337a60954f918adc384469663b6ec0a703e258cad303751841e8: Status 404 returned error can't find the container with id 0b57910f1dc4337a60954f918adc384469663b6ec0a703e258cad303751841e8 Oct 12 20:41:07 crc kubenswrapper[4773]: E1012 20:41:07.661294 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1: reading manifest sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1 in quay.io/openstack-k8s-operators/sg-core: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1" Oct 12 20:41:07 crc kubenswrapper[4773]: E1012 20:41:07.662185 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rkq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a202db6a-83d7-461f-8258-618d63c95bbf): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1: reading manifest sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1 in quay.io/openstack-k8s-operators/sg-core: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.242952 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" 
event={"ID":"1fa331c5-06f0-4fac-b997-c68390b26f62","Type":"ContainerStarted","Data":"f30ebdf5a01e1e47b06d62b82226d5e119ff2c20be791baa0d77d4eb5c4f95cd"} Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.248838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bccc98b47-7pq24" event={"ID":"bccbf811-29d3-4a21-856b-4ae1cfb29c74","Type":"ContainerStarted","Data":"63993124414fedcbce8bee380a3d41c6194d764a3995cf54b0f15cbf26e2348e"} Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.254248 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b76446c56-rrfn7" event={"ID":"44baa955-b25d-4648-aef5-423ad5992301","Type":"ContainerStarted","Data":"088bf1fdec99370c4abbb84108784231e24c504cb18ea2483d85ce9befbb2c93"} Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.254420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b76446c56-rrfn7" event={"ID":"44baa955-b25d-4648-aef5-423ad5992301","Type":"ContainerStarted","Data":"d845c00829008b9a8bc99956e91c6de529226f9ff179f6aeafccf0d93d9e2316"} Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.254519 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.254601 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b76446c56-rrfn7" event={"ID":"44baa955-b25d-4648-aef5-423ad5992301","Type":"ContainerStarted","Data":"0b57910f1dc4337a60954f918adc384469663b6ec0a703e258cad303751841e8"} Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.254779 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.263610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-b8cc45894-xq876" 
podStartSLOduration=3.080738355 podStartE2EDuration="6.263597669s" podCreationTimestamp="2025-10-12 20:41:02 +0000 UTC" firstStartedPulling="2025-10-12 20:41:03.647183307 +0000 UTC m=+1011.883481867" lastFinishedPulling="2025-10-12 20:41:06.830042621 +0000 UTC m=+1015.066341181" observedRunningTime="2025-10-12 20:41:08.260535184 +0000 UTC m=+1016.496833744" watchObservedRunningTime="2025-10-12 20:41:08.263597669 +0000 UTC m=+1016.499896229" Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.301688 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5bccc98b47-7pq24" podStartSLOduration=2.923157988 podStartE2EDuration="6.301536605s" podCreationTimestamp="2025-10-12 20:41:02 +0000 UTC" firstStartedPulling="2025-10-12 20:41:03.45078555 +0000 UTC m=+1011.687084110" lastFinishedPulling="2025-10-12 20:41:06.829164167 +0000 UTC m=+1015.065462727" observedRunningTime="2025-10-12 20:41:08.298031528 +0000 UTC m=+1016.534330108" watchObservedRunningTime="2025-10-12 20:41:08.301536605 +0000 UTC m=+1016.537835165" Oct 12 20:41:08 crc kubenswrapper[4773]: I1012 20:41:08.328845 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b76446c56-rrfn7" podStartSLOduration=2.328807917 podStartE2EDuration="2.328807917s" podCreationTimestamp="2025-10-12 20:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:08.321414503 +0000 UTC m=+1016.557713073" watchObservedRunningTime="2025-10-12 20:41:08.328807917 +0000 UTC m=+1016.565106477" Oct 12 20:41:09 crc kubenswrapper[4773]: I1012 20:41:09.265797 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qs9dw" event={"ID":"d06c5679-8ddf-4043-b3ae-4fd8986c4483","Type":"ContainerStarted","Data":"64cb79a506d6d49a490b14338ea8bc58c004c5ba6aa5a2bd60b0acc1e3f3721d"} Oct 12 20:41:12 crc kubenswrapper[4773]: I1012 
20:41:12.989877 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" Oct 12 20:41:13 crc kubenswrapper[4773]: I1012 20:41:13.021290 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qs9dw" podStartSLOduration=7.566204873 podStartE2EDuration="41.021271338s" podCreationTimestamp="2025-10-12 20:40:32 +0000 UTC" firstStartedPulling="2025-10-12 20:40:34.442265372 +0000 UTC m=+982.678563932" lastFinishedPulling="2025-10-12 20:41:07.897331847 +0000 UTC m=+1016.133630397" observedRunningTime="2025-10-12 20:41:09.290709627 +0000 UTC m=+1017.527008187" watchObservedRunningTime="2025-10-12 20:41:13.021271338 +0000 UTC m=+1021.257569898" Oct 12 20:41:13 crc kubenswrapper[4773]: I1012 20:41:13.056059 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:41:13 crc kubenswrapper[4773]: I1012 20:41:13.056257 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="dnsmasq-dns" containerID="cri-o://f132005610d867b2b66affbe3beb7c0416446da16e986370304f54da450d1bfa" gracePeriod=10 Oct 12 20:41:13 crc kubenswrapper[4773]: I1012 20:41:13.314725 4773 generic.go:334] "Generic (PLEG): container finished" podID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerID="f132005610d867b2b66affbe3beb7c0416446da16e986370304f54da450d1bfa" exitCode=0 Oct 12 20:41:13 crc kubenswrapper[4773]: I1012 20:41:13.314768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" event={"ID":"ebb34728-7bbe-4e28-ad43-e8913ce25a30","Type":"ContainerDied","Data":"f132005610d867b2b66affbe3beb7c0416446da16e986370304f54da450d1bfa"} Oct 12 20:41:14 crc kubenswrapper[4773]: I1012 20:41:14.324018 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" containerID="64cb79a506d6d49a490b14338ea8bc58c004c5ba6aa5a2bd60b0acc1e3f3721d" exitCode=0 Oct 12 20:41:14 crc kubenswrapper[4773]: I1012 20:41:14.324120 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qs9dw" event={"ID":"d06c5679-8ddf-4043-b3ae-4fd8986c4483","Type":"ContainerDied","Data":"64cb79a506d6d49a490b14338ea8bc58c004c5ba6aa5a2bd60b0acc1e3f3721d"} Oct 12 20:41:14 crc kubenswrapper[4773]: I1012 20:41:14.943474 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:14 crc kubenswrapper[4773]: I1012 20:41:14.955689 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:16 crc kubenswrapper[4773]: I1012 20:41:16.765583 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.171173 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.179181 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229425 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb\") pod \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229599 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc\") pod \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229676 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4sc\" (UniqueName: 
\"kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc\") pod \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229830 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb\") pod \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229860 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpnnj\" (UniqueName: \"kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229949 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config\") pod \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\" (UID: \"ebb34728-7bbe-4e28-ad43-e8913ce25a30\") " Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.229980 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data\") pod \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\" (UID: \"d06c5679-8ddf-4043-b3ae-4fd8986c4483\") " Oct 12 20:41:18 crc 
kubenswrapper[4773]: I1012 20:41:18.230544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.250943 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.252028 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts" (OuterVolumeSpecName: "scripts") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.258371 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc" (OuterVolumeSpecName: "kube-api-access-pd4sc") pod "ebb34728-7bbe-4e28-ad43-e8913ce25a30" (UID: "ebb34728-7bbe-4e28-ad43-e8913ce25a30"). InnerVolumeSpecName "kube-api-access-pd4sc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.264126 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj" (OuterVolumeSpecName: "kube-api-access-mpnnj") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "kube-api-access-mpnnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.296815 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.306625 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebb34728-7bbe-4e28-ad43-e8913ce25a30" (UID: "ebb34728-7bbe-4e28-ad43-e8913ce25a30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.315932 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebb34728-7bbe-4e28-ad43-e8913ce25a30" (UID: "ebb34728-7bbe-4e28-ad43-e8913ce25a30"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.326086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data" (OuterVolumeSpecName: "config-data") pod "d06c5679-8ddf-4043-b3ae-4fd8986c4483" (UID: "d06c5679-8ddf-4043-b3ae-4fd8986c4483"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.328629 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config" (OuterVolumeSpecName: "config") pod "ebb34728-7bbe-4e28-ad43-e8913ce25a30" (UID: "ebb34728-7bbe-4e28-ad43-e8913ce25a30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332516 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4sc\" (UniqueName: \"kubernetes.io/projected/ebb34728-7bbe-4e28-ad43-e8913ce25a30-kube-api-access-pd4sc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332550 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332560 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpnnj\" (UniqueName: \"kubernetes.io/projected/d06c5679-8ddf-4043-b3ae-4fd8986c4483-kube-api-access-mpnnj\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332571 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-config\") on node \"crc\" DevicePath \"\"" Oct 12 
20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332581 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332590 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332597 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332605 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d06c5679-8ddf-4043-b3ae-4fd8986c4483-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332613 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06c5679-8ddf-4043-b3ae-4fd8986c4483-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.332622 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.336044 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebb34728-7bbe-4e28-ad43-e8913ce25a30" (UID: "ebb34728-7bbe-4e28-ad43-e8913ce25a30"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.362616 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qs9dw" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.362963 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qs9dw" event={"ID":"d06c5679-8ddf-4043-b3ae-4fd8986c4483","Type":"ContainerDied","Data":"c5c6923fdb550f2a0e4ba16055649f88fba92873460e9de50ff729c3ebfa1cbb"} Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.363009 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5c6923fdb550f2a0e4ba16055649f88fba92873460e9de50ff729c3ebfa1cbb" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.365530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" event={"ID":"ebb34728-7bbe-4e28-ad43-e8913ce25a30","Type":"ContainerDied","Data":"da133027c3b478bfe010912f0dbe1743e3a66f824c547a7ace507b8ac9fb11f2"} Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.365558 4773 scope.go:117] "RemoveContainer" containerID="f132005610d867b2b66affbe3beb7c0416446da16e986370304f54da450d1bfa" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.365682 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-748d7644cf-8wxm8" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.400215 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.436519 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebb34728-7bbe-4e28-ad43-e8913ce25a30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.439214 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.440301 4773 scope.go:117] "RemoveContainer" containerID="d659d639cb5fbce92d37efde980f76656a4bd79d348b129e9032d0e7ca42ba95" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.448166 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-748d7644cf-8wxm8"] Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.468642 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b76446c56-rrfn7" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.499399 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" path="/var/lib/kubelet/pods/ebb34728-7bbe-4e28-ad43-e8913ce25a30/volumes" Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.524622 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.524868 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b7484c884-9t4p6" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api-log" containerID="cri-o://6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19" gracePeriod=30 Oct 12 
20:41:18 crc kubenswrapper[4773]: I1012 20:41:18.524953 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b7484c884-9t4p6" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api" containerID="cri-o://054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71" gracePeriod=30 Oct 12 20:41:18 crc kubenswrapper[4773]: E1012 20:41:18.822737 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading blob sha256:9e839f647fa6e0f20da534ddc6cd89e0b8a25507d3b9e72d84eae4c36623c6d9: fetching blob: received unexpected HTTP status: 504 Gateway Timeout" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 12 20:41:18 crc kubenswrapper[4773]: E1012 20:41:18.823218 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined
-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rkq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a202db6a-83d7-461f-8258-618d63c95bbf): ErrImagePull: copying system image from manifest list: reading blob sha256:9e839f647fa6e0f20da534ddc6cd89e0b8a25507d3b9e72d84eae4c36623c6d9: fetching blob: received unexpected HTTP status: 504 Gateway Timeout" logger="UnhandledError" Oct 12 20:41:18 crc kubenswrapper[4773]: E1012 20:41:18.824467 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"sg-core\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1: reading manifest sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1 in quay.io/openstack-k8s-operators/sg-core: received unexpected HTTP status: 504 Gateway Time-out\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"copying system image from manifest list: reading blob sha256:9e839f647fa6e0f20da534ddc6cd89e0b8a25507d3b9e72d84eae4c36623c6d9: fetching blob: received unexpected HTTP status: 504 Gateway Timeout\"]" pod="openstack/ceilometer-0" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.379767 4773 generic.go:334] "Generic (PLEG): container finished" podID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerID="6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19" exitCode=143 Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.380499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerDied","Data":"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19"} Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.380860 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-notification-agent" containerID="cri-o://c9d06f0b0bc33cf124ba08fb699dd167fd1a3dc2be2c329d5be0f90cc29dd0a5" gracePeriod=30 Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.381052 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-central-agent" 
containerID="cri-o://ba04e088640c1004893c8bd2e733d0dabff485a0fc72f3ed12edb690a7da1b8d" gracePeriod=30 Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.500118 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 20:41:19 crc kubenswrapper[4773]: E1012 20:41:19.500696 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="init" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.500788 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="init" Oct 12 20:41:19 crc kubenswrapper[4773]: E1012 20:41:19.500853 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="dnsmasq-dns" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.500909 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="dnsmasq-dns" Oct 12 20:41:19 crc kubenswrapper[4773]: E1012 20:41:19.500976 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" containerName="cinder-db-sync" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.501030 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" containerName="cinder-db-sync" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.501242 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb34728-7bbe-4e28-ad43-e8913ce25a30" containerName="dnsmasq-dns" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.501309 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" containerName="cinder-db-sync" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.502194 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.511391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.513960 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.514231 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.514448 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jvhs4" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.515126 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556781 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " 
pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556820 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556836 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.556870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.618827 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.620128 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658362 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmtf\" (UniqueName: \"kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " 
pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658629 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc 
kubenswrapper[4773]: I1012 20:41:19.658744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.658848 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.661025 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.667570 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.668824 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.681161 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.682156 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.698197 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp\") pod \"cinder-scheduler-0\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.717410 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.723790 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.726538 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.755893 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.759840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.759907 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data\") pod \"cinder-api-0\" (UID: 
\"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.759951 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmtf\" (UniqueName: \"kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.759983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760066 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc 
kubenswrapper[4773]: I1012 20:41:19.760096 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttz6f\" (UniqueName: \"kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760144 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.760195 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.761247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.761294 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.765252 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.765429 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.789447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmtf\" (UniqueName: \"kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf\") pod \"dnsmasq-dns-7bdc9d6cdc-gsdxc\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.825127 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861800 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861855 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861897 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861930 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861960 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.861994 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.862036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttz6f\" (UniqueName: \"kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.862614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.862676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.867564 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.869955 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.871899 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.872462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.880588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttz6f\" (UniqueName: \"kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f\") pod \"cinder-api-0\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " pod="openstack/cinder-api-0" Oct 12 20:41:19 crc kubenswrapper[4773]: I1012 20:41:19.937494 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.075211 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.100570 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.400798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerStarted","Data":"4220d06abd0cbeff3ab2f7fd0ed12e08a24cc381904d73f371f07304c657d8f0"} Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.407431 4773 generic.go:334] "Generic (PLEG): container finished" podID="a202db6a-83d7-461f-8258-618d63c95bbf" containerID="ba04e088640c1004893c8bd2e733d0dabff485a0fc72f3ed12edb690a7da1b8d" exitCode=0 Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.407476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerDied","Data":"ba04e088640c1004893c8bd2e733d0dabff485a0fc72f3ed12edb690a7da1b8d"} Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.456276 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:41:20 crc kubenswrapper[4773]: W1012 20:41:20.641302 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea720731_101f_46fa_8e96_536e44e1f379.slice/crio-2fedfbacd160f754ac5c5ae155ffc7b82f61c040e630dba91ba633b7f265d096 WatchSource:0}: Error finding container 2fedfbacd160f754ac5c5ae155ffc7b82f61c040e630dba91ba633b7f265d096: Status 404 returned error can't find the container with id 2fedfbacd160f754ac5c5ae155ffc7b82f61c040e630dba91ba633b7f265d096 Oct 12 20:41:20 crc kubenswrapper[4773]: I1012 20:41:20.641606 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.418915 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerStarted","Data":"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031"} Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.419219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerStarted","Data":"2fedfbacd160f754ac5c5ae155ffc7b82f61c040e630dba91ba633b7f265d096"} Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.427051 4773 generic.go:334] "Generic (PLEG): container finished" podID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerID="b1bd1c14c551b4a3c5307519bc9506bc5c9bfadffa7f708e0d4dd031ba71d7e2" exitCode=0 Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.427097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" event={"ID":"f973dfa8-7c8a-47bb-9685-4b4e36d479e0","Type":"ContainerDied","Data":"b1bd1c14c551b4a3c5307519bc9506bc5c9bfadffa7f708e0d4dd031ba71d7e2"} Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.427125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" event={"ID":"f973dfa8-7c8a-47bb-9685-4b4e36d479e0","Type":"ContainerStarted","Data":"ce87a4b53c65cf58dab8487789f8bcf1e355b3b5b278492b8f6746c2735fdb19"} Oct 12 20:41:21 crc kubenswrapper[4773]: I1012 20:41:21.557398 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.130027 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.224077 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data\") pod \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.224380 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle\") pod \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.224446 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom\") pod \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.224699 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5wn\" (UniqueName: \"kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn\") pod \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.224767 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs\") pod \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\" (UID: \"ac8427b5-d780-4732-ac2f-5cf6a30bb77b\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.225698 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs" (OuterVolumeSpecName: "logs") pod "ac8427b5-d780-4732-ac2f-5cf6a30bb77b" (UID: "ac8427b5-d780-4732-ac2f-5cf6a30bb77b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.234252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac8427b5-d780-4732-ac2f-5cf6a30bb77b" (UID: "ac8427b5-d780-4732-ac2f-5cf6a30bb77b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.240653 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn" (OuterVolumeSpecName: "kube-api-access-br5wn") pod "ac8427b5-d780-4732-ac2f-5cf6a30bb77b" (UID: "ac8427b5-d780-4732-ac2f-5cf6a30bb77b"). InnerVolumeSpecName "kube-api-access-br5wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.264109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac8427b5-d780-4732-ac2f-5cf6a30bb77b" (UID: "ac8427b5-d780-4732-ac2f-5cf6a30bb77b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.299220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data" (OuterVolumeSpecName: "config-data") pod "ac8427b5-d780-4732-ac2f-5cf6a30bb77b" (UID: "ac8427b5-d780-4732-ac2f-5cf6a30bb77b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.327230 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.327264 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.327275 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.327283 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5wn\" (UniqueName: \"kubernetes.io/projected/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-kube-api-access-br5wn\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.327294 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac8427b5-d780-4732-ac2f-5cf6a30bb77b-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.457026 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" event={"ID":"f973dfa8-7c8a-47bb-9685-4b4e36d479e0","Type":"ContainerStarted","Data":"1ba7a991f19b7a241f058973232f6ed5c46512aa8f75d96b65970a080497f406"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.458139 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.460571 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerStarted","Data":"d0910378f19e39a45b54f38eb54e60d59bad322b06441b81638d30185ba48005"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.463527 4773 generic.go:334] "Generic (PLEG): container finished" podID="a202db6a-83d7-461f-8258-618d63c95bbf" containerID="c9d06f0b0bc33cf124ba08fb699dd167fd1a3dc2be2c329d5be0f90cc29dd0a5" exitCode=0 Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.463562 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerDied","Data":"c9d06f0b0bc33cf124ba08fb699dd167fd1a3dc2be2c329d5be0f90cc29dd0a5"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.465346 4773 generic.go:334] "Generic (PLEG): container finished" podID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerID="054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71" exitCode=0 Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.465379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerDied","Data":"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.465393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b7484c884-9t4p6" event={"ID":"ac8427b5-d780-4732-ac2f-5cf6a30bb77b","Type":"ContainerDied","Data":"be2026507717aa53ba7e68be036a888fd6c862398f7e9fbf63767eb9c0e64a4b"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.465411 4773 scope.go:117] "RemoveContainer" containerID="054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.465502 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b7484c884-9t4p6" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.499819 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api-log" containerID="cri-o://9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" gracePeriod=30 Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.499932 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api" containerID="cri-o://f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" gracePeriod=30 Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.519444 4773 scope.go:117] "RemoveContainer" containerID="6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.522054 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" podStartSLOduration=3.522040744 podStartE2EDuration="3.522040744s" podCreationTimestamp="2025-10-12 20:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:22.485364923 +0000 UTC m=+1030.721663483" watchObservedRunningTime="2025-10-12 20:41:22.522040744 +0000 UTC m=+1030.758339304" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.526131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerStarted","Data":"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2"} Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.526178 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 12 20:41:22 crc 
kubenswrapper[4773]: I1012 20:41:22.563397 4773 scope.go:117] "RemoveContainer" containerID="054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.565987 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.574389 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b7484c884-9t4p6"] Oct 12 20:41:22 crc kubenswrapper[4773]: E1012 20:41:22.579013 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71\": container with ID starting with 054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71 not found: ID does not exist" containerID="054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.579065 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71"} err="failed to get container status \"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71\": rpc error: code = NotFound desc = could not find container \"054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71\": container with ID starting with 054f9517fd3f1fec61245992573988e29047d487dce789d5dcfb4ebcebaacc71 not found: ID does not exist" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.579189 4773 scope.go:117] "RemoveContainer" containerID="6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19" Oct 12 20:41:22 crc kubenswrapper[4773]: E1012 20:41:22.579565 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19\": container with 
ID starting with 6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19 not found: ID does not exist" containerID="6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.579595 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19"} err="failed to get container status \"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19\": rpc error: code = NotFound desc = could not find container \"6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19\": container with ID starting with 6f7cdddce21bb043997e6043780a018ec3cb4a6890d46271ccda3f5db9375d19 not found: ID does not exist" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.591070 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.591050698 podStartE2EDuration="3.591050698s" podCreationTimestamp="2025-10-12 20:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:22.583385556 +0000 UTC m=+1030.819684116" watchObservedRunningTime="2025-10-12 20:41:22.591050698 +0000 UTC m=+1030.827349258" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.710909 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732513 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732599 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rkq8\" (UniqueName: \"kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732813 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.732869 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle\") pod \"a202db6a-83d7-461f-8258-618d63c95bbf\" (UID: \"a202db6a-83d7-461f-8258-618d63c95bbf\") " Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.734195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.734235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.740602 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts" (OuterVolumeSpecName: "scripts") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.741051 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.741155 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8" (OuterVolumeSpecName: "kube-api-access-7rkq8") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "kube-api-access-7rkq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.779797 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.784083 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data" (OuterVolumeSpecName: "config-data") pod "a202db6a-83d7-461f-8258-618d63c95bbf" (UID: "a202db6a-83d7-461f-8258-618d63c95bbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835865 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835906 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835915 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835924 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a202db6a-83d7-461f-8258-618d63c95bbf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835932 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835939 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a202db6a-83d7-461f-8258-618d63c95bbf-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:22 crc kubenswrapper[4773]: I1012 20:41:22.835947 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rkq8\" (UniqueName: \"kubernetes.io/projected/a202db6a-83d7-461f-8258-618d63c95bbf-kube-api-access-7rkq8\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.412198 4773 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447642 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447802 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.447864 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.448002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttz6f\" (UniqueName: \"kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f\") pod \"ea720731-101f-46fa-8e96-536e44e1f379\" (UID: \"ea720731-101f-46fa-8e96-536e44e1f379\") " Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.448033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs" (OuterVolumeSpecName: "logs") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.448401 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea720731-101f-46fa-8e96-536e44e1f379-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.448478 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea720731-101f-46fa-8e96-536e44e1f379-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.471504 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts" (OuterVolumeSpecName: "scripts") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.483849 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.484283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f" (OuterVolumeSpecName: "kube-api-access-ttz6f") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "kube-api-access-ttz6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.508686 4773 generic.go:334] "Generic (PLEG): container finished" podID="ea720731-101f-46fa-8e96-536e44e1f379" containerID="f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" exitCode=0 Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.508862 4773 generic.go:334] "Generic (PLEG): container finished" podID="ea720731-101f-46fa-8e96-536e44e1f379" containerID="9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" exitCode=143 Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.510185 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerDied","Data":"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2"} Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.510689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerDied","Data":"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031"} Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.510768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ea720731-101f-46fa-8e96-536e44e1f379","Type":"ContainerDied","Data":"2fedfbacd160f754ac5c5ae155ffc7b82f61c040e630dba91ba633b7f265d096"} Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.510829 4773 scope.go:117] "RemoveContainer" containerID="f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.510320 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.515023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.520392 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerStarted","Data":"22f84580858faa00e167c8f7b644b2f063bbbf99c74681fc384fb7365b056232"} Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.525512 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a202db6a-83d7-461f-8258-618d63c95bbf","Type":"ContainerDied","Data":"25b2c5e142e13732884fa4dbe7214ed7ab022995019cbc17cc0f309ae71ccc8b"} Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.525659 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.546408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data" (OuterVolumeSpecName: "config-data") pod "ea720731-101f-46fa-8e96-536e44e1f379" (UID: "ea720731-101f-46fa-8e96-536e44e1f379"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570246 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttz6f\" (UniqueName: \"kubernetes.io/projected/ea720731-101f-46fa-8e96-536e44e1f379-kube-api-access-ttz6f\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570432 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570453 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570463 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570473 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea720731-101f-46fa-8e96-536e44e1f379-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.570661 4773 scope.go:117] "RemoveContainer" containerID="9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.576752 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.718514204 podStartE2EDuration="4.576729794s" podCreationTimestamp="2025-10-12 20:41:19 +0000 UTC" firstStartedPulling="2025-10-12 20:41:20.180441902 +0000 UTC m=+1028.416740462" lastFinishedPulling="2025-10-12 20:41:21.038657482 +0000 
UTC m=+1029.274956052" observedRunningTime="2025-10-12 20:41:23.546568812 +0000 UTC m=+1031.782867372" watchObservedRunningTime="2025-10-12 20:41:23.576729794 +0000 UTC m=+1031.813028354" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.629283 4773 scope.go:117] "RemoveContainer" containerID="f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.631642 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2\": container with ID starting with f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2 not found: ID does not exist" containerID="f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.631683 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2"} err="failed to get container status \"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2\": rpc error: code = NotFound desc = could not find container \"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2\": container with ID starting with f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2 not found: ID does not exist" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.631703 4773 scope.go:117] "RemoveContainer" containerID="9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.632050 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031\": container with ID starting with 9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031 not found: ID does not exist" 
containerID="9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632093 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031"} err="failed to get container status \"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031\": rpc error: code = NotFound desc = could not find container \"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031\": container with ID starting with 9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031 not found: ID does not exist" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632119 4773 scope.go:117] "RemoveContainer" containerID="f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632414 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2"} err="failed to get container status \"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2\": rpc error: code = NotFound desc = could not find container \"f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2\": container with ID starting with f7d22486a8aabf1a6b3eebad4ad55def2467af7914e2354b8670ee0b910e1be2 not found: ID does not exist" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632448 4773 scope.go:117] "RemoveContainer" containerID="9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632736 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031"} err="failed to get container status \"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031\": rpc error: code = NotFound desc = could 
not find container \"9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031\": container with ID starting with 9906c46bf8d5a9c2f0f6203cbdc8b2dadbc9d8a1f1820a2837d64b2db8f30031 not found: ID does not exist" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.632752 4773 scope.go:117] "RemoveContainer" containerID="c9d06f0b0bc33cf124ba08fb699dd167fd1a3dc2be2c329d5be0f90cc29dd0a5" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.650657 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.664834 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.671734 4773 scope.go:117] "RemoveContainer" containerID="ba04e088640c1004893c8bd2e733d0dabff485a0fc72f3ed12edb690a7da1b8d" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.677428 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678073 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678095 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678114 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678120 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678131 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" 
containerName="ceilometer-notification-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678137 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-notification-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678154 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-central-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678160 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-central-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678175 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678180 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api" Oct 12 20:41:23 crc kubenswrapper[4773]: E1012 20:41:23.678194 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678200 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678345 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678359 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-notification-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678383 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678390 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea720731-101f-46fa-8e96-536e44e1f379" containerName="cinder-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678403 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" containerName="barbican-api-log" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.678414 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" containerName="ceilometer-central-agent" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.679886 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.686521 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.687712 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.689631 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd\") pod 
\"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773517 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773610 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9mzc\" (UniqueName: \"kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.773637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: 
I1012 20:41:23.861795 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.865969 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874698 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874746 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874790 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874806 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " 
pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874840 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9mzc\" (UniqueName: \"kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.874873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.875284 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.875892 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.876859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.878359 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.879061 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.879366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.881006 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.881465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.881623 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.881761 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.882477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.903569 
4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.909446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9mzc\" (UniqueName: \"kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc\") pod \"ceilometer-0\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " pod="openstack/ceilometer-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976396 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b1d281-5528-455b-8b30-b636772d29ce-logs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b1d281-5528-455b-8b30-b636772d29ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0" Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 
20:41:23.976798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976874 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-scripts\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.976983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:23 crc kubenswrapper[4773]: I1012 20:41:23.977026 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmrg\" (UniqueName: \"kubernetes.io/projected/47b1d281-5528-455b-8b30-b636772d29ce-kube-api-access-bzmrg\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.007149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079079 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmrg\" (UniqueName: \"kubernetes.io/projected/47b1d281-5528-455b-8b30-b636772d29ce-kube-api-access-bzmrg\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b1d281-5528-455b-8b30-b636772d29ce-logs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079144 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b1d281-5528-455b-8b30-b636772d29ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079207 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079262 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079272 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b1d281-5528-455b-8b30-b636772d29ce-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-scripts\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.079548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b1d281-5528-455b-8b30-b636772d29ce-logs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.085461 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.085591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.085996 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.086107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-scripts\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.089235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.089693 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b1d281-5528-455b-8b30-b636772d29ce-config-data\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.105784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmrg\" (UniqueName: \"kubernetes.io/projected/47b1d281-5528-455b-8b30-b636772d29ce-kube-api-access-bzmrg\") pod \"cinder-api-0\" (UID: \"47b1d281-5528-455b-8b30-b636772d29ce\") " pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.207851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.492364 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a202db6a-83d7-461f-8258-618d63c95bbf" path="/var/lib/kubelet/pods/a202db6a-83d7-461f-8258-618d63c95bbf/volumes"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.493741 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8427b5-d780-4732-ac2f-5cf6a30bb77b" path="/var/lib/kubelet/pods/ac8427b5-d780-4732-ac2f-5cf6a30bb77b/volumes"
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.494297 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea720731-101f-46fa-8e96-536e44e1f379" path="/var/lib/kubelet/pods/ea720731-101f-46fa-8e96-536e44e1f379/volumes"
Oct 12 20:41:24 crc kubenswrapper[4773]: W1012 20:41:24.495266 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d8717f_68d5_4e5b_823c_ef3821bf40ef.slice/crio-c2b819761e5b80b78c817c646be7ab09250ef08de334cae51c0015556412885a WatchSource:0}: Error finding container c2b819761e5b80b78c817c646be7ab09250ef08de334cae51c0015556412885a: Status 404 returned error can't find the container with id c2b819761e5b80b78c817c646be7ab09250ef08de334cae51c0015556412885a
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.496066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.555768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerStarted","Data":"c2b819761e5b80b78c817c646be7ab09250ef08de334cae51c0015556412885a"}
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.659128 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 12 20:41:24 crc kubenswrapper[4773]: W1012 20:41:24.663587 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b1d281_5528_455b_8b30_b636772d29ce.slice/crio-b3f156236575ff5662b698d95f96b4a7e1def5293e9a28a54650346234f152fd WatchSource:0}: Error finding container b3f156236575ff5662b698d95f96b4a7e1def5293e9a28a54650346234f152fd: Status 404 returned error can't find the container with id b3f156236575ff5662b698d95f96b4a7e1def5293e9a28a54650346234f152fd
Oct 12 20:41:24 crc kubenswrapper[4773]: I1012 20:41:24.826097 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 12 20:41:25 crc kubenswrapper[4773]: I1012 20:41:25.566308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47b1d281-5528-455b-8b30-b636772d29ce","Type":"ContainerStarted","Data":"9e4b0461c47055381dad81f33d5e127b9887ba7f487957fe716aab49ffc05f0e"}
Oct 12 20:41:25 crc kubenswrapper[4773]: I1012 20:41:25.566588 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47b1d281-5528-455b-8b30-b636772d29ce","Type":"ContainerStarted","Data":"b3f156236575ff5662b698d95f96b4a7e1def5293e9a28a54650346234f152fd"}
Oct 12 20:41:25 crc kubenswrapper[4773]: I1012 20:41:25.569612 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerStarted","Data":"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379"}
Oct 12 20:41:26 crc kubenswrapper[4773]: I1012 20:41:26.580269 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47b1d281-5528-455b-8b30-b636772d29ce","Type":"ContainerStarted","Data":"8aeb558706b5d42ffb0f2c47c4044be387a34613cfa83d087d454695033df7a2"}
Oct 12 20:41:26 crc kubenswrapper[4773]: I1012 20:41:26.582751 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerStarted","Data":"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a"}
Oct 12 20:41:26 crc kubenswrapper[4773]: I1012 20:41:26.603743 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.603708809 podStartE2EDuration="3.603708809s" podCreationTimestamp="2025-10-12 20:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:26.59686078 +0000 UTC m=+1034.833159340" watchObservedRunningTime="2025-10-12 20:41:26.603708809 +0000 UTC m=+1034.840007369"
Oct 12 20:41:27 crc kubenswrapper[4773]: I1012 20:41:27.590911 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 12 20:41:29 crc kubenswrapper[4773]: I1012 20:41:29.939858 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.010680 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"]
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.010956 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="dnsmasq-dns" containerID="cri-o://cd9d5ac415de44b249204ce75a3a1a50d825cef2dc16b789ed54fbbd6574956d" gracePeriod=10
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.421128 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.499143 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.628824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerStarted","Data":"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c"}
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.642961 4773 generic.go:334] "Generic (PLEG): container finished" podID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerID="cd9d5ac415de44b249204ce75a3a1a50d825cef2dc16b789ed54fbbd6574956d" exitCode=0
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.643008 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" event={"ID":"f96ff898-cc06-439e-8dcb-880c1c0a462a","Type":"ContainerDied","Data":"cd9d5ac415de44b249204ce75a3a1a50d825cef2dc16b789ed54fbbd6574956d"}
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.643184 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="cinder-scheduler" containerID="cri-o://d0910378f19e39a45b54f38eb54e60d59bad322b06441b81638d30185ba48005" gracePeriod=30
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.644702 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="probe" containerID="cri-o://22f84580858faa00e167c8f7b644b2f063bbbf99c74681fc384fb7365b056232" gracePeriod=30
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.712086 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d44cb954d-ggb9c"
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.722035 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.855151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb\") pod \"f96ff898-cc06-439e-8dcb-880c1c0a462a\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") "
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.855201 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc\") pod \"f96ff898-cc06-439e-8dcb-880c1c0a462a\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") "
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.855315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp5bx\" (UniqueName: \"kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx\") pod \"f96ff898-cc06-439e-8dcb-880c1c0a462a\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") "
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.855335 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb\") pod \"f96ff898-cc06-439e-8dcb-880c1c0a462a\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") "
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.855371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config\") pod \"f96ff898-cc06-439e-8dcb-880c1c0a462a\" (UID: \"f96ff898-cc06-439e-8dcb-880c1c0a462a\") "
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.875984 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx" (OuterVolumeSpecName: "kube-api-access-cp5bx") pod "f96ff898-cc06-439e-8dcb-880c1c0a462a" (UID: "f96ff898-cc06-439e-8dcb-880c1c0a462a"). InnerVolumeSpecName "kube-api-access-cp5bx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.947161 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f96ff898-cc06-439e-8dcb-880c1c0a462a" (UID: "f96ff898-cc06-439e-8dcb-880c1c0a462a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.958549 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp5bx\" (UniqueName: \"kubernetes.io/projected/f96ff898-cc06-439e-8dcb-880c1c0a462a-kube-api-access-cp5bx\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.958570 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.958723 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f96ff898-cc06-439e-8dcb-880c1c0a462a" (UID: "f96ff898-cc06-439e-8dcb-880c1c0a462a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.984516 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f96ff898-cc06-439e-8dcb-880c1c0a462a" (UID: "f96ff898-cc06-439e-8dcb-880c1c0a462a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:41:30 crc kubenswrapper[4773]: I1012 20:41:30.993827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config" (OuterVolumeSpecName: "config") pod "f96ff898-cc06-439e-8dcb-880c1c0a462a" (UID: "f96ff898-cc06-439e-8dcb-880c1c0a462a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.060171 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-config\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.060207 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.060217 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f96ff898-cc06-439e-8dcb-880c1c0a462a-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.658502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w" event={"ID":"f96ff898-cc06-439e-8dcb-880c1c0a462a","Type":"ContainerDied","Data":"d39f7db3356048bbd23f89d41bb6045de2507e2c9ebbdc23f5ba764a6840392b"}
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.658848 4773 scope.go:117] "RemoveContainer" containerID="cd9d5ac415de44b249204ce75a3a1a50d825cef2dc16b789ed54fbbd6574956d"
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.658865 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.662752 4773 generic.go:334] "Generic (PLEG): container finished" podID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerID="22f84580858faa00e167c8f7b644b2f063bbbf99c74681fc384fb7365b056232" exitCode=0
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.662782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerDied","Data":"22f84580858faa00e167c8f7b644b2f063bbbf99c74681fc384fb7365b056232"}
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.692245 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"]
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.693639 4773 scope.go:117] "RemoveContainer" containerID="db8a5f1dd5b1d540442da36b1f5cf342c6755e3c053ceada8a512e8103bc17c4"
Oct 12 20:41:31 crc kubenswrapper[4773]: I1012 20:41:31.702558 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5bdc4b9-kjn4w"]
Oct 12 20:41:32 crc kubenswrapper[4773]: I1012 20:41:32.508159 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" path="/var/lib/kubelet/pods/f96ff898-cc06-439e-8dcb-880c1c0a462a/volumes"
Oct 12 20:41:32 crc kubenswrapper[4773]: I1012 20:41:32.691425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerStarted","Data":"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4"}
Oct 12 20:41:32 crc kubenswrapper[4773]: I1012 20:41:32.693248 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 12 20:41:32 crc kubenswrapper[4773]: I1012 20:41:32.713763 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.840149996 podStartE2EDuration="9.713747488s" podCreationTimestamp="2025-10-12 20:41:23 +0000 UTC" firstStartedPulling="2025-10-12 20:41:24.500229414 +0000 UTC m=+1032.736527984" lastFinishedPulling="2025-10-12 20:41:32.373826916 +0000 UTC m=+1040.610125476" observedRunningTime="2025-10-12 20:41:32.710210709 +0000 UTC m=+1040.946509269" watchObservedRunningTime="2025-10-12 20:41:32.713747488 +0000 UTC m=+1040.950046058"
Oct 12 20:41:33 crc kubenswrapper[4773]: I1012 20:41:33.849967 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7857d9f9fc-69hqj"
Oct 12 20:41:33 crc kubenswrapper[4773]: I1012 20:41:33.931035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d7c87b9bb-vwlxb"
Oct 12 20:41:33 crc kubenswrapper[4773]: I1012 20:41:33.938420 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"]
Oct 12 20:41:33 crc kubenswrapper[4773]: I1012 20:41:33.938643 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d44cb954d-ggb9c" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-api" containerID="cri-o://0e644269c022118a0d1ffaf53214b01fa7fec22645564c5b06c1b23783468950" gracePeriod=30
Oct 12 20:41:33 crc kubenswrapper[4773]: I1012 20:41:33.938795 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d44cb954d-ggb9c" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-httpd" containerID="cri-o://961e997ad0ae2e634e480f584dc1b94bee53f1bac372a46f6096fdb85b964912" gracePeriod=30
Oct 12 20:41:34 crc kubenswrapper[4773]: I1012 20:41:34.052158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d7c87b9bb-vwlxb"
Oct 12 20:41:34 crc kubenswrapper[4773]: I1012 20:41:34.724187 4773 generic.go:334] "Generic (PLEG): container finished" podID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerID="961e997ad0ae2e634e480f584dc1b94bee53f1bac372a46f6096fdb85b964912" exitCode=0
Oct 12 20:41:34 crc kubenswrapper[4773]: I1012 20:41:34.724482 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerDied","Data":"961e997ad0ae2e634e480f584dc1b94bee53f1bac372a46f6096fdb85b964912"}
Oct 12 20:41:34 crc kubenswrapper[4773]: I1012 20:41:34.743010 4773 generic.go:334] "Generic (PLEG): container finished" podID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerID="d0910378f19e39a45b54f38eb54e60d59bad322b06441b81638d30185ba48005" exitCode=0
Oct 12 20:41:34 crc kubenswrapper[4773]: I1012 20:41:34.743565 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerDied","Data":"d0910378f19e39a45b54f38eb54e60d59bad322b06441b81638d30185ba48005"}
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.329448 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.354677 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.354766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.354807 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.354833 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.354993 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.355061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.355778 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom\") pod \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\" (UID: \"8bc10f5d-f5c8-4f38-bd5f-158c1227b657\") "
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.356119 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.366984 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.368904 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp" (OuterVolumeSpecName: "kube-api-access-vr2xp") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "kube-api-access-vr2xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.377061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts" (OuterVolumeSpecName: "scripts") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.431055 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.458158 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.458191 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.458207 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.458218 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-kube-api-access-vr2xp\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.496878 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data" (OuterVolumeSpecName: "config-data") pod "8bc10f5d-f5c8-4f38-bd5f-158c1227b657" (UID: "8bc10f5d-f5c8-4f38-bd5f-158c1227b657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.559132 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10f5d-f5c8-4f38-bd5f-158c1227b657-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.578100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d9b9d6b96-hvhdj"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.750943 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8bc10f5d-f5c8-4f38-bd5f-158c1227b657","Type":"ContainerDied","Data":"4220d06abd0cbeff3ab2f7fd0ed12e08a24cc381904d73f371f07304c657d8f0"}
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.751488 4773 scope.go:117] "RemoveContainer" containerID="22f84580858faa00e167c8f7b644b2f063bbbf99c74681fc384fb7365b056232"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.751456 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.832401 4773 scope.go:117] "RemoveContainer" containerID="d0910378f19e39a45b54f38eb54e60d59bad322b06441b81638d30185ba48005"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.841162 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.871844 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.910154 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 20:41:35 crc kubenswrapper[4773]: E1012 20:41:35.913281 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="dnsmasq-dns"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.913386 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="dnsmasq-dns"
Oct 12 20:41:35 crc kubenswrapper[4773]: E1012 20:41:35.913440 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="probe"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.913503 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="probe"
Oct 12 20:41:35 crc kubenswrapper[4773]: E1012 20:41:35.913574 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="init"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.913627 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="init"
Oct 12 20:41:35 crc kubenswrapper[4773]: E1012 20:41:35.913706 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="cinder-scheduler"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.913788 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="cinder-scheduler"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.914086 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="cinder-scheduler"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.914156 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" containerName="probe"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.914248 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96ff898-cc06-439e-8dcb-880c1c0a462a" containerName="dnsmasq-dns"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.915296 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.922286 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.923306 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.941206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.942438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.945822 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-z4z6j"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.946049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.946172 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 12 20:41:35 crc kubenswrapper[4773]: I1012 20:41:35.948336 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.069894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0"
Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.069948 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32308982-4e4e-4ca5-98d8-b173e22fa341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0"
Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.069998 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5lrw\" (UniqueName: \"kubernetes.io/projected/32308982-4e4e-4ca5-98d8-b173e22fa341-kube-api-access-p5lrw\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0"
Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070046 4773 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7hb\" (UniqueName: \"kubernetes.io/projected/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-kube-api-access-2k7hb\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-scripts\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config-secret\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.070690 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.173918 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.174769 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175367 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.174693 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config\") pod 
\"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32308982-4e4e-4ca5-98d8-b173e22fa341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175507 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5lrw\" (UniqueName: \"kubernetes.io/projected/32308982-4e4e-4ca5-98d8-b173e22fa341-kube-api-access-p5lrw\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32308982-4e4e-4ca5-98d8-b173e22fa341-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175611 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7hb\" (UniqueName: \"kubernetes.io/projected/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-kube-api-access-2k7hb\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.175637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-scripts\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 
20:41:36.175681 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config-secret\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.176243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.176278 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.180497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-scripts\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.180686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.181834 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.183149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-openstack-config-secret\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.183383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.183923 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32308982-4e4e-4ca5-98d8-b173e22fa341-config-data\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.195227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5lrw\" (UniqueName: \"kubernetes.io/projected/32308982-4e4e-4ca5-98d8-b173e22fa341-kube-api-access-p5lrw\") pod \"cinder-scheduler-0\" (UID: \"32308982-4e4e-4ca5-98d8-b173e22fa341\") " pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.195233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7hb\" (UniqueName: \"kubernetes.io/projected/3028de9d-aaa8-4c46-9cbb-a4ab147bf458-kube-api-access-2k7hb\") pod \"openstackclient\" (UID: \"3028de9d-aaa8-4c46-9cbb-a4ab147bf458\") " 
pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.240279 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.267741 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.490528 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc10f5d-f5c8-4f38-bd5f-158c1227b657" path="/var/lib/kubelet/pods/8bc10f5d-f5c8-4f38-bd5f-158c1227b657/volumes" Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.775381 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.850861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 20:41:36 crc kubenswrapper[4773]: W1012 20:41:36.856385 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32308982_4e4e_4ca5_98d8_b173e22fa341.slice/crio-f9eeaf14b51fd9864b192b98665d9b437999479b548e39f2ba4bbdf7aa3ee64a WatchSource:0}: Error finding container f9eeaf14b51fd9864b192b98665d9b437999479b548e39f2ba4bbdf7aa3ee64a: Status 404 returned error can't find the container with id f9eeaf14b51fd9864b192b98665d9b437999479b548e39f2ba4bbdf7aa3ee64a Oct 12 20:41:36 crc kubenswrapper[4773]: I1012 20:41:36.982569 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 12 20:41:37 crc kubenswrapper[4773]: I1012 20:41:37.779835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32308982-4e4e-4ca5-98d8-b173e22fa341","Type":"ContainerStarted","Data":"d964a03fac34fc56ae7a03d0181d8abd54c6056b29354d579e2ed430aa0e5c7a"} Oct 12 20:41:37 crc kubenswrapper[4773]: I1012 
20:41:37.780241 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32308982-4e4e-4ca5-98d8-b173e22fa341","Type":"ContainerStarted","Data":"f9eeaf14b51fd9864b192b98665d9b437999479b548e39f2ba4bbdf7aa3ee64a"} Oct 12 20:41:37 crc kubenswrapper[4773]: I1012 20:41:37.781837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3028de9d-aaa8-4c46-9cbb-a4ab147bf458","Type":"ContainerStarted","Data":"38e31f4e21a8470daa6d12bf1ee749c99d260507df39f79a263b9250b983bf98"} Oct 12 20:41:38 crc kubenswrapper[4773]: I1012 20:41:38.795762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32308982-4e4e-4ca5-98d8-b173e22fa341","Type":"ContainerStarted","Data":"583597ed676e6889babe76b96e86e43a7780dd9e17da92bb03234d2a90c555ae"} Oct 12 20:41:38 crc kubenswrapper[4773]: I1012 20:41:38.832072 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.832051124 podStartE2EDuration="3.832051124s" podCreationTimestamp="2025-10-12 20:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:38.816988845 +0000 UTC m=+1047.053287405" watchObservedRunningTime="2025-10-12 20:41:38.832051124 +0000 UTC m=+1047.068349684" Oct 12 20:41:41 crc kubenswrapper[4773]: I1012 20:41:41.241521 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.420974 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-97sg9"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.421999 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.437726 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-97sg9"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.513134 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hk2db"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.514126 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.529224 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hk2db"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.531045 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c9lp\" (UniqueName: \"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp\") pod \"nova-api-db-create-97sg9\" (UID: \"5c987c62-7e44-493e-b3ad-a23e02081559\") " pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.613201 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lqr4m"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.614677 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.630130 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lqr4m"] Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.634861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjfs\" (UniqueName: \"kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs\") pod \"nova-cell0-db-create-hk2db\" (UID: \"6cf26e17-d821-48a7-8b36-55b6c8616c22\") " pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.634919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79sjm\" (UniqueName: \"kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm\") pod \"nova-cell1-db-create-lqr4m\" (UID: \"1950e421-2f64-4dd1-ba88-acb86d2921dc\") " pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.634967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c9lp\" (UniqueName: \"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp\") pod \"nova-api-db-create-97sg9\" (UID: \"5c987c62-7e44-493e-b3ad-a23e02081559\") " pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.698557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c9lp\" (UniqueName: \"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp\") pod \"nova-api-db-create-97sg9\" (UID: \"5c987c62-7e44-493e-b3ad-a23e02081559\") " pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.736742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8mjfs\" (UniqueName: \"kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs\") pod \"nova-cell0-db-create-hk2db\" (UID: \"6cf26e17-d821-48a7-8b36-55b6c8616c22\") " pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.736973 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79sjm\" (UniqueName: \"kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm\") pod \"nova-cell1-db-create-lqr4m\" (UID: \"1950e421-2f64-4dd1-ba88-acb86d2921dc\") " pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.742930 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.753963 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjfs\" (UniqueName: \"kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs\") pod \"nova-cell0-db-create-hk2db\" (UID: \"6cf26e17-d821-48a7-8b36-55b6c8616c22\") " pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.768855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79sjm\" (UniqueName: \"kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm\") pod \"nova-cell1-db-create-lqr4m\" (UID: \"1950e421-2f64-4dd1-ba88-acb86d2921dc\") " pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.835243 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:43 crc kubenswrapper[4773]: I1012 20:41:43.936732 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:46 crc kubenswrapper[4773]: I1012 20:41:46.516100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.630734 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-97sg9"] Oct 12 20:41:48 crc kubenswrapper[4773]: W1012 20:41:48.644911 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c987c62_7e44_493e_b3ad_a23e02081559.slice/crio-07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce WatchSource:0}: Error finding container 07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce: Status 404 returned error can't find the container with id 07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.724488 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lqr4m"] Oct 12 20:41:48 crc kubenswrapper[4773]: W1012 20:41:48.729459 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1950e421_2f64_4dd1_ba88_acb86d2921dc.slice/crio-e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0 WatchSource:0}: Error finding container e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0: Status 404 returned error can't find the container with id e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0 Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.793032 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hk2db"] Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.881509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"3028de9d-aaa8-4c46-9cbb-a4ab147bf458","Type":"ContainerStarted","Data":"b230f62d105bc548886ff7c8f7b65a3f178ac2633e5ecdaaa3b7cc19de4e2e6b"} Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.884631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lqr4m" event={"ID":"1950e421-2f64-4dd1-ba88-acb86d2921dc","Type":"ContainerStarted","Data":"e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0"} Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.886939 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hk2db" event={"ID":"6cf26e17-d821-48a7-8b36-55b6c8616c22","Type":"ContainerStarted","Data":"9e4b9509d89feb54a5bf39e6933f41df5e4b2d88b7c641ebde3e7898daf5464b"} Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.889459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-97sg9" event={"ID":"5c987c62-7e44-493e-b3ad-a23e02081559","Type":"ContainerStarted","Data":"4db74c28f0b720521375ec3574e54e8c2705fd71f170c06cebb52477d8d70b6b"} Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.889484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-97sg9" event={"ID":"5c987c62-7e44-493e-b3ad-a23e02081559","Type":"ContainerStarted","Data":"07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce"} Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.907009 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.5511469079999998 podStartE2EDuration="13.906991812s" podCreationTimestamp="2025-10-12 20:41:35 +0000 UTC" firstStartedPulling="2025-10-12 20:41:36.791183828 +0000 UTC m=+1045.027482388" lastFinishedPulling="2025-10-12 20:41:48.147028732 +0000 UTC m=+1056.383327292" observedRunningTime="2025-10-12 20:41:48.899852324 +0000 UTC m=+1057.136150884" watchObservedRunningTime="2025-10-12 20:41:48.906991812 +0000 
UTC m=+1057.143290372" Oct 12 20:41:48 crc kubenswrapper[4773]: I1012 20:41:48.942882 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-97sg9" podStartSLOduration=5.94286214 podStartE2EDuration="5.94286214s" podCreationTimestamp="2025-10-12 20:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:41:48.937011957 +0000 UTC m=+1057.173310517" watchObservedRunningTime="2025-10-12 20:41:48.94286214 +0000 UTC m=+1057.179160700" Oct 12 20:41:49 crc kubenswrapper[4773]: I1012 20:41:49.898136 4773 generic.go:334] "Generic (PLEG): container finished" podID="6cf26e17-d821-48a7-8b36-55b6c8616c22" containerID="8243b170b608c1ea3f5f21efcfa797fbd22df9ee249a4262b81dfd80fea67fad" exitCode=0 Oct 12 20:41:49 crc kubenswrapper[4773]: I1012 20:41:49.898242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hk2db" event={"ID":"6cf26e17-d821-48a7-8b36-55b6c8616c22","Type":"ContainerDied","Data":"8243b170b608c1ea3f5f21efcfa797fbd22df9ee249a4262b81dfd80fea67fad"} Oct 12 20:41:49 crc kubenswrapper[4773]: I1012 20:41:49.901008 4773 generic.go:334] "Generic (PLEG): container finished" podID="5c987c62-7e44-493e-b3ad-a23e02081559" containerID="4db74c28f0b720521375ec3574e54e8c2705fd71f170c06cebb52477d8d70b6b" exitCode=0 Oct 12 20:41:49 crc kubenswrapper[4773]: I1012 20:41:49.901055 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-97sg9" event={"ID":"5c987c62-7e44-493e-b3ad-a23e02081559","Type":"ContainerDied","Data":"4db74c28f0b720521375ec3574e54e8c2705fd71f170c06cebb52477d8d70b6b"} Oct 12 20:41:49 crc kubenswrapper[4773]: I1012 20:41:49.903047 4773 generic.go:334] "Generic (PLEG): container finished" podID="1950e421-2f64-4dd1-ba88-acb86d2921dc" containerID="50792f005db49478681242fb97b47cdb8c40d62b068a9cf0b080904cdb0923cf" exitCode=0 Oct 12 20:41:49 crc 
kubenswrapper[4773]: I1012 20:41:49.903128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lqr4m" event={"ID":"1950e421-2f64-4dd1-ba88-acb86d2921dc","Type":"ContainerDied","Data":"50792f005db49478681242fb97b47cdb8c40d62b068a9cf0b080904cdb0923cf"} Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.836051 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.836369 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-central-agent" containerID="cri-o://8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379" gracePeriod=30 Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.836495 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="sg-core" containerID="cri-o://d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c" gracePeriod=30 Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.836540 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-notification-agent" containerID="cri-o://2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a" gracePeriod=30 Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.836568 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="proxy-httpd" containerID="cri-o://26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4" gracePeriod=30 Oct 12 20:41:50 crc kubenswrapper[4773]: I1012 20:41:50.859130 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.152:3000/\": EOF" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.387366 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.412903 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.447320 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.588726 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c9lp\" (UniqueName: \"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp\") pod \"5c987c62-7e44-493e-b3ad-a23e02081559\" (UID: \"5c987c62-7e44-493e-b3ad-a23e02081559\") " Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.588801 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mjfs\" (UniqueName: \"kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs\") pod \"6cf26e17-d821-48a7-8b36-55b6c8616c22\" (UID: \"6cf26e17-d821-48a7-8b36-55b6c8616c22\") " Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.588845 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79sjm\" (UniqueName: \"kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm\") pod \"1950e421-2f64-4dd1-ba88-acb86d2921dc\" (UID: \"1950e421-2f64-4dd1-ba88-acb86d2921dc\") " Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.595320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp" (OuterVolumeSpecName: "kube-api-access-9c9lp") pod "5c987c62-7e44-493e-b3ad-a23e02081559" (UID: "5c987c62-7e44-493e-b3ad-a23e02081559"). InnerVolumeSpecName "kube-api-access-9c9lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.595392 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs" (OuterVolumeSpecName: "kube-api-access-8mjfs") pod "6cf26e17-d821-48a7-8b36-55b6c8616c22" (UID: "6cf26e17-d821-48a7-8b36-55b6c8616c22"). InnerVolumeSpecName "kube-api-access-8mjfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.596836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm" (OuterVolumeSpecName: "kube-api-access-79sjm") pod "1950e421-2f64-4dd1-ba88-acb86d2921dc" (UID: "1950e421-2f64-4dd1-ba88-acb86d2921dc"). InnerVolumeSpecName "kube-api-access-79sjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.691545 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c9lp\" (UniqueName: \"kubernetes.io/projected/5c987c62-7e44-493e-b3ad-a23e02081559-kube-api-access-9c9lp\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.691589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mjfs\" (UniqueName: \"kubernetes.io/projected/6cf26e17-d821-48a7-8b36-55b6c8616c22-kube-api-access-8mjfs\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.691605 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79sjm\" (UniqueName: \"kubernetes.io/projected/1950e421-2f64-4dd1-ba88-acb86d2921dc-kube-api-access-79sjm\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.930580 4773 generic.go:334] "Generic (PLEG): container finished" podID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerID="26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4" exitCode=0 Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.930609 4773 generic.go:334] "Generic (PLEG): container finished" podID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerID="d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c" exitCode=2 Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.930619 4773 generic.go:334] "Generic (PLEG): container finished" podID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerID="8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379" exitCode=0 Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.930655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerDied","Data":"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4"} Oct 12 20:41:51 crc 
kubenswrapper[4773]: I1012 20:41:51.930735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerDied","Data":"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c"} Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.930753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerDied","Data":"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379"} Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.932029 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lqr4m" event={"ID":"1950e421-2f64-4dd1-ba88-acb86d2921dc","Type":"ContainerDied","Data":"e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0"} Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.932062 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e154fc40bcd8315f6642f914f4f9d909fab966e74601ce81952aa6818add7bd0" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.932043 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lqr4m" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.933158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hk2db" event={"ID":"6cf26e17-d821-48a7-8b36-55b6c8616c22","Type":"ContainerDied","Data":"9e4b9509d89feb54a5bf39e6933f41df5e4b2d88b7c641ebde3e7898daf5464b"} Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.933184 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e4b9509d89feb54a5bf39e6933f41df5e4b2d88b7c641ebde3e7898daf5464b" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.933237 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hk2db" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.934133 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-97sg9" event={"ID":"5c987c62-7e44-493e-b3ad-a23e02081559","Type":"ContainerDied","Data":"07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce"} Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.934156 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-97sg9" Oct 12 20:41:51 crc kubenswrapper[4773]: I1012 20:41:51.934172 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07ceea0a43a6371709e9eaad689dd6b3e204b187c71de81c1c37c9e55cef89ce" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.534616 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628347 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-479b-account-create-s6gs5"] Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628771 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-notification-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628788 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-notification-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628803 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c987c62-7e44-493e-b3ad-a23e02081559" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628809 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c987c62-7e44-493e-b3ad-a23e02081559" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: 
E1012 20:41:53.628823 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-central-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628829 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-central-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628847 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1950e421-2f64-4dd1-ba88-acb86d2921dc" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628853 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1950e421-2f64-4dd1-ba88-acb86d2921dc" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628865 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="proxy-httpd" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628870 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="proxy-httpd" Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628882 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf26e17-d821-48a7-8b36-55b6c8616c22" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628887 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf26e17-d821-48a7-8b36-55b6c8616c22" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: E1012 20:41:53.628899 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="sg-core" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.628905 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="sg-core" Oct 12 20:41:53 crc 
kubenswrapper[4773]: I1012 20:41:53.629053 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf26e17-d821-48a7-8b36-55b6c8616c22" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629068 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-notification-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629077 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="ceilometer-central-agent" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629088 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1950e421-2f64-4dd1-ba88-acb86d2921dc" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629097 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="sg-core" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629108 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerName="proxy-httpd" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629115 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c987c62-7e44-493e-b3ad-a23e02081559" containerName="mariadb-database-create" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.629691 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632264 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632392 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632441 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc 
kubenswrapper[4773]: I1012 20:41:53.632499 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.632552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9mzc\" (UniqueName: \"kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc\") pod \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\" (UID: \"46d8717f-68d5-4e5b-823c-ef3821bf40ef\") " Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.634294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.640508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts" (OuterVolumeSpecName: "scripts") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.642598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.649882 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-479b-account-create-s6gs5"] Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.659280 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc" (OuterVolumeSpecName: "kube-api-access-c9mzc") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "kube-api-access-c9mzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.676210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.737692 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data" (OuterVolumeSpecName: "config-data") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738458 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74sc\" (UniqueName: \"kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc\") pod \"nova-api-479b-account-create-s6gs5\" (UID: \"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a\") " pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738804 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738828 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738839 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46d8717f-68d5-4e5b-823c-ef3821bf40ef-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738850 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738860 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.738871 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9mzc\" (UniqueName: 
\"kubernetes.io/projected/46d8717f-68d5-4e5b-823c-ef3821bf40ef-kube-api-access-c9mzc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.752356 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46d8717f-68d5-4e5b-823c-ef3821bf40ef" (UID: "46d8717f-68d5-4e5b-823c-ef3821bf40ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.840316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74sc\" (UniqueName: \"kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc\") pod \"nova-api-479b-account-create-s6gs5\" (UID: \"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a\") " pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.840375 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-49ba-account-create-5892n"] Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.840402 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d8717f-68d5-4e5b-823c-ef3821bf40ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.841769 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.845345 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.868247 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-49ba-account-create-5892n"] Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.877772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74sc\" (UniqueName: \"kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc\") pod \"nova-api-479b-account-create-s6gs5\" (UID: \"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a\") " pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.942081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psndp\" (UniqueName: \"kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp\") pod \"nova-cell0-49ba-account-create-5892n\" (UID: \"e1f59080-ff5e-44d1-a7a3-2b8642c8d883\") " pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.955697 4773 generic.go:334] "Generic (PLEG): container finished" podID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" containerID="2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a" exitCode=0 Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.955764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerDied","Data":"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a"} Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.955797 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"46d8717f-68d5-4e5b-823c-ef3821bf40ef","Type":"ContainerDied","Data":"c2b819761e5b80b78c817c646be7ab09250ef08de334cae51c0015556412885a"} Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.955815 4773 scope.go:117] "RemoveContainer" containerID="26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4" Oct 12 20:41:53 crc kubenswrapper[4773]: I1012 20:41:53.955962 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.006404 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.009773 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.015887 4773 scope.go:117] "RemoveContainer" containerID="d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.019132 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.021915 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.026120 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.026288 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.032774 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.040400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.043464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psndp\" (UniqueName: \"kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp\") pod \"nova-cell0-49ba-account-create-5892n\" (UID: \"e1f59080-ff5e-44d1-a7a3-2b8642c8d883\") " pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.062817 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-50ec-account-create-gk4vd"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.063945 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.066068 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.071466 4773 scope.go:117] "RemoveContainer" containerID="2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.074317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psndp\" (UniqueName: \"kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp\") pod \"nova-cell0-49ba-account-create-5892n\" (UID: \"e1f59080-ff5e-44d1-a7a3-2b8642c8d883\") " pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.093400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-50ec-account-create-gk4vd"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.130593 4773 
scope.go:117] "RemoveContainer" containerID="8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.144672 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbpx\" (UniqueName: \"kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.144779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.144817 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.144844 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.144905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 
20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.145086 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.145253 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.171757 4773 scope.go:117] "RemoveContainer" containerID="26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4" Oct 12 20:41:54 crc kubenswrapper[4773]: E1012 20:41:54.174869 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4\": container with ID starting with 26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4 not found: ID does not exist" containerID="26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.174912 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4"} err="failed to get container status \"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4\": rpc error: code = NotFound desc = could not find container \"26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4\": container with ID starting with 26481ff9ccbe41a3629cdc622830786fc9fccb9f27a7225bb87fabcbe453e0d4 not found: ID does not exist" Oct 12 20:41:54 crc kubenswrapper[4773]: 
I1012 20:41:54.174940 4773 scope.go:117] "RemoveContainer" containerID="d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c" Oct 12 20:41:54 crc kubenswrapper[4773]: E1012 20:41:54.176295 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c\": container with ID starting with d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c not found: ID does not exist" containerID="d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.176317 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c"} err="failed to get container status \"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c\": rpc error: code = NotFound desc = could not find container \"d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c\": container with ID starting with d948aed408e2ac2324b574f26abb020313d8a4d76f9b9a1392db28ea2251ee3c not found: ID does not exist" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.176332 4773 scope.go:117] "RemoveContainer" containerID="2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a" Oct 12 20:41:54 crc kubenswrapper[4773]: E1012 20:41:54.178417 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a\": container with ID starting with 2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a not found: ID does not exist" containerID="2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.178441 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a"} err="failed to get container status \"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a\": rpc error: code = NotFound desc = could not find container \"2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a\": container with ID starting with 2f250deda9891badde571cd0a910440276395ef5784d42db684de72500a6326a not found: ID does not exist" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.178461 4773 scope.go:117] "RemoveContainer" containerID="8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.178682 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:54 crc kubenswrapper[4773]: E1012 20:41:54.188235 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379\": container with ID starting with 8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379 not found: ID does not exist" containerID="8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.188590 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379"} err="failed to get container status \"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379\": rpc error: code = NotFound desc = could not find container \"8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379\": container with ID starting with 8febdf86effbab2442456859b0e7006a525a8e5bd1c23b512476786f42279379 not found: ID does not exist" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254165 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6fbpx\" (UniqueName: \"kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254259 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254304 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts\") pod \"ceilometer-0\" (UID: 
\"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.254399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzztc\" (UniqueName: \"kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc\") pod \"nova-cell1-50ec-account-create-gk4vd\" (UID: \"6d550930-e1d2-4785-bf9d-70959c014912\") " pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.255099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.255271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.263919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.264360 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.267547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.275986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.287559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbpx\" (UniqueName: \"kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx\") pod \"ceilometer-0\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.347043 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.359530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzztc\" (UniqueName: \"kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc\") pod \"nova-cell1-50ec-account-create-gk4vd\" (UID: \"6d550930-e1d2-4785-bf9d-70959c014912\") " pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.383482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzztc\" (UniqueName: \"kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc\") pod \"nova-cell1-50ec-account-create-gk4vd\" (UID: \"6d550930-e1d2-4785-bf9d-70959c014912\") " pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.387823 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.495654 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d8717f-68d5-4e5b-823c-ef3821bf40ef" path="/var/lib/kubelet/pods/46d8717f-68d5-4e5b-823c-ef3821bf40ef/volumes" Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.571003 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-479b-account-create-s6gs5"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.739688 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-50ec-account-create-gk4vd"] Oct 12 20:41:54 crc kubenswrapper[4773]: W1012 20:41:54.743530 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d550930_e1d2_4785_bf9d_70959c014912.slice/crio-507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8 WatchSource:0}: 
Error finding container 507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8: Status 404 returned error can't find the container with id 507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8 Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.748422 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-49ba-account-create-5892n"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.851220 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.967313 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-49ba-account-create-5892n" event={"ID":"e1f59080-ff5e-44d1-a7a3-2b8642c8d883","Type":"ContainerStarted","Data":"9b578cd27272d730d1f8c2ccafb272cfdc3edd0fb8886d0c89b0c48e4462ff98"} Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.969703 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerStarted","Data":"d906d89d6fc9be0079ea88a987b79746af1c07363eb5a1f465605d6b9ea196d5"} Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.970902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ec-account-create-gk4vd" event={"ID":"6d550930-e1d2-4785-bf9d-70959c014912","Type":"ContainerStarted","Data":"507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8"} Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.972423 4773 generic.go:334] "Generic (PLEG): container finished" podID="fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" containerID="09dcec4d009944e7a5b91bfa992f2039851a8a0bb30860d1759a3ee2a0996d3e" exitCode=0 Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.972468 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-479b-account-create-s6gs5" 
event={"ID":"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a","Type":"ContainerDied","Data":"09dcec4d009944e7a5b91bfa992f2039851a8a0bb30860d1759a3ee2a0996d3e"} Oct 12 20:41:54 crc kubenswrapper[4773]: I1012 20:41:54.972493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-479b-account-create-s6gs5" event={"ID":"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a","Type":"ContainerStarted","Data":"68bfe4641e9acdedf47429466913b5ff3536f988c50873d6455927f6b899a0f4"} Oct 12 20:41:55 crc kubenswrapper[4773]: I1012 20:41:55.982073 4773 generic.go:334] "Generic (PLEG): container finished" podID="6d550930-e1d2-4785-bf9d-70959c014912" containerID="af98ce080a1cdf129b9e5aed28ba6618f3e37052f0aa0db4e23a3fd0daa75ce7" exitCode=0 Oct 12 20:41:55 crc kubenswrapper[4773]: I1012 20:41:55.982374 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ec-account-create-gk4vd" event={"ID":"6d550930-e1d2-4785-bf9d-70959c014912","Type":"ContainerDied","Data":"af98ce080a1cdf129b9e5aed28ba6618f3e37052f0aa0db4e23a3fd0daa75ce7"} Oct 12 20:41:55 crc kubenswrapper[4773]: I1012 20:41:55.983935 4773 generic.go:334] "Generic (PLEG): container finished" podID="e1f59080-ff5e-44d1-a7a3-2b8642c8d883" containerID="6c94aa746bfcb6747dca20641f2630d1045e2c3e29d4b6c944d5a97b5887090e" exitCode=0 Oct 12 20:41:55 crc kubenswrapper[4773]: I1012 20:41:55.983975 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-49ba-account-create-5892n" event={"ID":"e1f59080-ff5e-44d1-a7a3-2b8642c8d883","Type":"ContainerDied","Data":"6c94aa746bfcb6747dca20641f2630d1045e2c3e29d4b6c944d5a97b5887090e"} Oct 12 20:41:55 crc kubenswrapper[4773]: I1012 20:41:55.985195 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerStarted","Data":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} Oct 12 20:41:56 crc kubenswrapper[4773]: I1012 20:41:56.362315 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:56 crc kubenswrapper[4773]: I1012 20:41:56.508263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74sc\" (UniqueName: \"kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc\") pod \"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a\" (UID: \"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a\") " Oct 12 20:41:56 crc kubenswrapper[4773]: I1012 20:41:56.512645 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc" (OuterVolumeSpecName: "kube-api-access-l74sc") pod "fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" (UID: "fa8513a2-9bb8-4b6a-892e-d77814d7ad9a"). InnerVolumeSpecName "kube-api-access-l74sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:56 crc kubenswrapper[4773]: I1012 20:41:56.610351 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74sc\" (UniqueName: \"kubernetes.io/projected/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a-kube-api-access-l74sc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.001396 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-479b-account-create-s6gs5" event={"ID":"fa8513a2-9bb8-4b6a-892e-d77814d7ad9a","Type":"ContainerDied","Data":"68bfe4641e9acdedf47429466913b5ff3536f988c50873d6455927f6b899a0f4"} Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.001432 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bfe4641e9acdedf47429466913b5ff3536f988c50873d6455927f6b899a0f4" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.001474 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-479b-account-create-s6gs5" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.007567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerStarted","Data":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.276964 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.405359 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.422893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzztc\" (UniqueName: \"kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc\") pod \"6d550930-e1d2-4785-bf9d-70959c014912\" (UID: \"6d550930-e1d2-4785-bf9d-70959c014912\") " Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.427817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc" (OuterVolumeSpecName: "kube-api-access-lzztc") pod "6d550930-e1d2-4785-bf9d-70959c014912" (UID: "6d550930-e1d2-4785-bf9d-70959c014912"). InnerVolumeSpecName "kube-api-access-lzztc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.524942 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psndp\" (UniqueName: \"kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp\") pod \"e1f59080-ff5e-44d1-a7a3-2b8642c8d883\" (UID: \"e1f59080-ff5e-44d1-a7a3-2b8642c8d883\") " Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.525499 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzztc\" (UniqueName: \"kubernetes.io/projected/6d550930-e1d2-4785-bf9d-70959c014912-kube-api-access-lzztc\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.527993 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp" (OuterVolumeSpecName: "kube-api-access-psndp") pod "e1f59080-ff5e-44d1-a7a3-2b8642c8d883" (UID: "e1f59080-ff5e-44d1-a7a3-2b8642c8d883"). InnerVolumeSpecName "kube-api-access-psndp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:41:57 crc kubenswrapper[4773]: I1012 20:41:57.627641 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psndp\" (UniqueName: \"kubernetes.io/projected/e1f59080-ff5e-44d1-a7a3-2b8642c8d883-kube-api-access-psndp\") on node \"crc\" DevicePath \"\"" Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.018949 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-49ba-account-create-5892n" Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.022833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-49ba-account-create-5892n" event={"ID":"e1f59080-ff5e-44d1-a7a3-2b8642c8d883","Type":"ContainerDied","Data":"9b578cd27272d730d1f8c2ccafb272cfdc3edd0fb8886d0c89b0c48e4462ff98"} Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.022871 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b578cd27272d730d1f8c2ccafb272cfdc3edd0fb8886d0c89b0c48e4462ff98" Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.026688 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerStarted","Data":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.031635 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-50ec-account-create-gk4vd" Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.031615 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-50ec-account-create-gk4vd" event={"ID":"6d550930-e1d2-4785-bf9d-70959c014912","Type":"ContainerDied","Data":"507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8"} Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.031742 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507817497fd13a656e49aa222e1d8485e68a99ec40434171f57e79516691c2f8" Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.669025 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:41:58 crc kubenswrapper[4773]: I1012 20:41:58.669300 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.040234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerStarted","Data":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.041269 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.071970 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.88057058 podStartE2EDuration="6.071949504s" podCreationTimestamp="2025-10-12 20:41:53 +0000 UTC" firstStartedPulling="2025-10-12 20:41:54.867517092 +0000 UTC m=+1063.103815652" lastFinishedPulling="2025-10-12 20:41:58.058896016 +0000 UTC m=+1066.295194576" observedRunningTime="2025-10-12 20:41:59.063427617 +0000 UTC m=+1067.299726177" watchObservedRunningTime="2025-10-12 20:41:59.071949504 +0000 UTC m=+1067.308248054" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164190 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkjgn"] Oct 12 20:41:59 crc kubenswrapper[4773]: E1012 20:41:59.164673 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f59080-ff5e-44d1-a7a3-2b8642c8d883" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164686 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f59080-ff5e-44d1-a7a3-2b8642c8d883" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: E1012 20:41:59.164704 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164710 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: E1012 20:41:59.164721 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d550930-e1d2-4785-bf9d-70959c014912" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164727 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d550930-e1d2-4785-bf9d-70959c014912" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164909 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e1f59080-ff5e-44d1-a7a3-2b8642c8d883" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164924 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d550930-e1d2-4785-bf9d-70959c014912" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.164932 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" containerName="mariadb-account-create" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.165454 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.167604 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.167879 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.168056 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tcrrv" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.183817 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkjgn"] Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.260921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92xgq\" (UniqueName: \"kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.261287 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.261421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.261555 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.363150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.363427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.363523 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.363694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92xgq\" (UniqueName: \"kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.373872 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.377355 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.379696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.383451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92xgq\" 
(UniqueName: \"kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq\") pod \"nova-cell0-conductor-db-sync-kkjgn\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.480481 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.961615 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkjgn"] Oct 12 20:41:59 crc kubenswrapper[4773]: I1012 20:41:59.966890 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:00 crc kubenswrapper[4773]: I1012 20:42:00.047974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" event={"ID":"3655b0f6-2e88-4b9e-b836-18633f2f0535","Type":"ContainerStarted","Data":"f77b498bbfd93c64fc76fad3878e6a2654b6a6f82eecf187466561632d15ec0d"} Oct 12 20:42:00 crc kubenswrapper[4773]: I1012 20:42:00.698232 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d44cb954d-ggb9c" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.140:9696/\": dial tcp 10.217.0.140:9696: connect: connection refused" Oct 12 20:42:01 crc kubenswrapper[4773]: I1012 20:42:01.065238 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-central-agent" containerID="cri-o://6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" gracePeriod=30 Oct 12 20:42:01 crc kubenswrapper[4773]: I1012 20:42:01.065503 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="proxy-httpd" containerID="cri-o://c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" gracePeriod=30 Oct 12 20:42:01 crc kubenswrapper[4773]: I1012 20:42:01.065683 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-notification-agent" containerID="cri-o://3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" gracePeriod=30 Oct 12 20:42:01 crc kubenswrapper[4773]: I1012 20:42:01.065753 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="sg-core" containerID="cri-o://380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" gracePeriod=30 Oct 12 20:42:01 crc kubenswrapper[4773]: I1012 20:42:01.969421 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080301 4773 generic.go:334] "Generic (PLEG): container finished" podID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" exitCode=0 Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080342 4773 generic.go:334] "Generic (PLEG): container finished" podID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" exitCode=2 Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080357 4773 generic.go:334] "Generic (PLEG): container finished" podID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" exitCode=0 Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080369 4773 generic.go:334] "Generic (PLEG): container finished" podID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" 
containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" exitCode=0 Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080388 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerDied","Data":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerDied","Data":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080576 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerDied","Data":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerDied","Data":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4eb7f353-b9e5-48d8-9606-f1bce9649ff7","Type":"ContainerDied","Data":"d906d89d6fc9be0079ea88a987b79746af1c07363eb5a1f465605d6b9ea196d5"} Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.080627 4773 scope.go:117] "RemoveContainer" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.107686 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108177 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fbpx\" (UniqueName: \"kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: 
\"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data\") pod \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\" (UID: \"4eb7f353-b9e5-48d8-9606-f1bce9649ff7\") " Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.108854 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.109057 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.112782 4773 scope.go:117] "RemoveContainer" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.113936 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts" (OuterVolumeSpecName: "scripts") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.113993 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx" (OuterVolumeSpecName: "kube-api-access-6fbpx") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "kube-api-access-6fbpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.135716 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.136469 4773 scope.go:117] "RemoveContainer" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.159746 4773 scope.go:117] "RemoveContainer" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.188048 4773 scope.go:117] "RemoveContainer" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.189402 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": container with ID starting with c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37 not found: ID does not exist" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc 
kubenswrapper[4773]: I1012 20:42:02.189433 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} err="failed to get container status \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": rpc error: code = NotFound desc = could not find container \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": container with ID starting with c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37 not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.189452 4773 scope.go:117] "RemoveContainer" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.194074 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.197986 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": container with ID starting with 380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f not found: ID does not exist" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.198016 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} err="failed to get container status \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": rpc error: code = NotFound desc = could not find container \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": container with ID starting with 380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.198031 4773 scope.go:117] "RemoveContainer" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.200812 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": container with ID starting with 3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a not found: ID does not exist" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.200837 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} err="failed 
to get container status \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": rpc error: code = NotFound desc = could not find container \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": container with ID starting with 3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.200850 4773 scope.go:117] "RemoveContainer" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.206038 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": container with ID starting with 6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff not found: ID does not exist" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.206062 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} err="failed to get container status \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": rpc error: code = NotFound desc = could not find container \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": container with ID starting with 6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.206076 4773 scope.go:117] "RemoveContainer" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.207618 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} 
err="failed to get container status \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": rpc error: code = NotFound desc = could not find container \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": container with ID starting with c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37 not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.207642 4773 scope.go:117] "RemoveContainer" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209680 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fbpx\" (UniqueName: \"kubernetes.io/projected/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-kube-api-access-6fbpx\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209702 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209710 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209721 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209740 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.209750 4773 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.216551 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} err="failed to get container status \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": rpc error: code = NotFound desc = could not find container \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": container with ID starting with 380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.216620 4773 scope.go:117] "RemoveContainer" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.216944 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} err="failed to get container status \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": rpc error: code = NotFound desc = could not find container \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": container with ID starting with 3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.216967 4773 scope.go:117] "RemoveContainer" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.217304 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} err="failed to get container status 
\"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": rpc error: code = NotFound desc = could not find container \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": container with ID starting with 6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.217346 4773 scope.go:117] "RemoveContainer" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.217689 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} err="failed to get container status \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": rpc error: code = NotFound desc = could not find container \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": container with ID starting with c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37 not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.217708 4773 scope.go:117] "RemoveContainer" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218089 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} err="failed to get container status \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": rpc error: code = NotFound desc = could not find container \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": container with ID starting with 380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218107 4773 scope.go:117] "RemoveContainer" 
containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218305 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} err="failed to get container status \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": rpc error: code = NotFound desc = could not find container \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": container with ID starting with 3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218322 4773 scope.go:117] "RemoveContainer" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218628 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} err="failed to get container status \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": rpc error: code = NotFound desc = could not find container \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": container with ID starting with 6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218646 4773 scope.go:117] "RemoveContainer" containerID="c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218972 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37"} err="failed to get container status \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": rpc error: code = NotFound desc = could 
not find container \"c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37\": container with ID starting with c020bd47bc9f57a2889fa5888ba394b3096c3a1b823bba4887136091d9c0bd37 not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.218992 4773 scope.go:117] "RemoveContainer" containerID="380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.219227 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f"} err="failed to get container status \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": rpc error: code = NotFound desc = could not find container \"380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f\": container with ID starting with 380dc907dde98ca0ae23d9fdf90ec8cd1d297e531ee285a4bbeae3dfdccb805f not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.219245 4773 scope.go:117] "RemoveContainer" containerID="3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.221950 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a"} err="failed to get container status \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": rpc error: code = NotFound desc = could not find container \"3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a\": container with ID starting with 3d7c547089265570c2e02cd4f07bbe95902aa24af3c14adbbce2892ab760ce5a not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.222003 4773 scope.go:117] "RemoveContainer" containerID="6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 
20:42:02.224610 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff"} err="failed to get container status \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": rpc error: code = NotFound desc = could not find container \"6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff\": container with ID starting with 6c140dc1115d4261202bce22798e8fde4409cbbba2568353710c385f01c53bff not found: ID does not exist" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.249846 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data" (OuterVolumeSpecName: "config-data") pod "4eb7f353-b9e5-48d8-9606-f1bce9649ff7" (UID: "4eb7f353-b9e5-48d8-9606-f1bce9649ff7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.312930 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7f353-b9e5-48d8-9606-f1bce9649ff7-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.418643 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.428391 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.439622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.440024 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-notification-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440046 4773 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-notification-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.440071 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-central-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440082 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-central-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.440106 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="sg-core" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440117 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="sg-core" Oct 12 20:42:02 crc kubenswrapper[4773]: E1012 20:42:02.440138 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="proxy-httpd" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440147 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="proxy-httpd" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440349 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-notification-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440370 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="proxy-httpd" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440397 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="sg-core" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.440413 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" containerName="ceilometer-central-agent" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.444397 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.453691 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.453839 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.470233 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.495166 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb7f353-b9e5-48d8-9606-f1bce9649ff7" path="/var/lib/kubelet/pods/4eb7f353-b9e5-48d8-9606-f1bce9649ff7/volumes" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515333 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-552p9\" (UniqueName: \"kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.515559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617428 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " 
pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-552p9\" (UniqueName: \"kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.617673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.618775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.619882 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.624147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.624306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.629326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.639852 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-552p9\" (UniqueName: \"kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.649436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " pod="openstack/ceilometer-0" Oct 12 20:42:02 crc kubenswrapper[4773]: I1012 20:42:02.760202 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:03 crc kubenswrapper[4773]: I1012 20:42:03.238710 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.130940 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerStarted","Data":"e0a24d1ab3d0d74ca2ffec4e19ec0f94845ee042ede518011fe486f74e3210b3"} Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.131390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerStarted","Data":"26fecc9420a48c627e821e543599ee1e229dcc704b20bb2d2713447358d538b6"} Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.165839 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d44cb954d-ggb9c_921796c7-7ab3-4924-bd37-a998ccfab6e3/neutron-api/0.log" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.165900 4773 generic.go:334] "Generic (PLEG): container finished" podID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerID="0e644269c022118a0d1ffaf53214b01fa7fec22645564c5b06c1b23783468950" exitCode=137 Oct 12 
20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.165955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerDied","Data":"0e644269c022118a0d1ffaf53214b01fa7fec22645564c5b06c1b23783468950"} Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.505191 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d44cb954d-ggb9c_921796c7-7ab3-4924-bd37-a998ccfab6e3/neutron-api/0.log" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.505516 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.550868 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs\") pod \"921796c7-7ab3-4924-bd37-a998ccfab6e3\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.550920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f6xm\" (UniqueName: \"kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm\") pod \"921796c7-7ab3-4924-bd37-a998ccfab6e3\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.550975 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config\") pod \"921796c7-7ab3-4924-bd37-a998ccfab6e3\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.550998 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle\") pod \"921796c7-7ab3-4924-bd37-a998ccfab6e3\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.551130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config\") pod \"921796c7-7ab3-4924-bd37-a998ccfab6e3\" (UID: \"921796c7-7ab3-4924-bd37-a998ccfab6e3\") " Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.558585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm" (OuterVolumeSpecName: "kube-api-access-5f6xm") pod "921796c7-7ab3-4924-bd37-a998ccfab6e3" (UID: "921796c7-7ab3-4924-bd37-a998ccfab6e3"). InnerVolumeSpecName "kube-api-access-5f6xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.558723 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "921796c7-7ab3-4924-bd37-a998ccfab6e3" (UID: "921796c7-7ab3-4924-bd37-a998ccfab6e3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.631134 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config" (OuterVolumeSpecName: "config") pod "921796c7-7ab3-4924-bd37-a998ccfab6e3" (UID: "921796c7-7ab3-4924-bd37-a998ccfab6e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.653396 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.653425 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f6xm\" (UniqueName: \"kubernetes.io/projected/921796c7-7ab3-4924-bd37-a998ccfab6e3-kube-api-access-5f6xm\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.653436 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.655832 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "921796c7-7ab3-4924-bd37-a998ccfab6e3" (UID: "921796c7-7ab3-4924-bd37-a998ccfab6e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.680494 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "921796c7-7ab3-4924-bd37-a998ccfab6e3" (UID: "921796c7-7ab3-4924-bd37-a998ccfab6e3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.755138 4773 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:04 crc kubenswrapper[4773]: I1012 20:42:04.755164 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/921796c7-7ab3-4924-bd37-a998ccfab6e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.184837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerStarted","Data":"30948c4a785e10b8642b0cee44ba942c39ee2d253368350e056ad0e5dcec4aec"} Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.188550 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d44cb954d-ggb9c_921796c7-7ab3-4924-bd37-a998ccfab6e3/neutron-api/0.log" Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.188608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d44cb954d-ggb9c" event={"ID":"921796c7-7ab3-4924-bd37-a998ccfab6e3","Type":"ContainerDied","Data":"902cc020dd0059f48e31bd22915e913626381fac66c2a518c3aec59e3bd859cd"} Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.188647 4773 scope.go:117] "RemoveContainer" containerID="961e997ad0ae2e634e480f584dc1b94bee53f1bac372a46f6096fdb85b964912" Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.188669 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d44cb954d-ggb9c" Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.275037 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"] Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.282559 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d44cb954d-ggb9c"] Oct 12 20:42:05 crc kubenswrapper[4773]: I1012 20:42:05.349778 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:06 crc kubenswrapper[4773]: I1012 20:42:06.199890 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerStarted","Data":"ae9b4b3ef47961e294cfc2bb734826fd49a09af2792be53c1bf3eaa0d8e0353c"} Oct 12 20:42:06 crc kubenswrapper[4773]: I1012 20:42:06.491688 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" path="/var/lib/kubelet/pods/921796c7-7ab3-4924-bd37-a998ccfab6e3/volumes" Oct 12 20:42:10 crc kubenswrapper[4773]: I1012 20:42:10.744396 4773 scope.go:117] "RemoveContainer" containerID="0e644269c022118a0d1ffaf53214b01fa7fec22645564c5b06c1b23783468950" Oct 12 20:42:12 crc kubenswrapper[4773]: I1012 20:42:12.279916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" event={"ID":"3655b0f6-2e88-4b9e-b836-18633f2f0535","Type":"ContainerStarted","Data":"e0a97b4c2a394f1f9bb07b083f66f1289ba13399d3951eedc7d428b3fa435a3a"} Oct 12 20:42:12 crc kubenswrapper[4773]: I1012 20:42:12.308453 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" podStartSLOduration=2.230792349 podStartE2EDuration="13.308435167s" podCreationTimestamp="2025-10-12 20:41:59 +0000 UTC" firstStartedPulling="2025-10-12 20:41:59.981565726 +0000 UTC m=+1068.217864286" lastFinishedPulling="2025-10-12 
20:42:11.059208544 +0000 UTC m=+1079.295507104" observedRunningTime="2025-10-12 20:42:12.29414163 +0000 UTC m=+1080.530440190" watchObservedRunningTime="2025-10-12 20:42:12.308435167 +0000 UTC m=+1080.544733727" Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.297849 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerStarted","Data":"4f5eac2db6734e81520d33d370d98e53e1c9c80ee9e39479c083413939d230b5"} Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.298509 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.298171 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="proxy-httpd" containerID="cri-o://4f5eac2db6734e81520d33d370d98e53e1c9c80ee9e39479c083413939d230b5" gracePeriod=30 Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.298181 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-notification-agent" containerID="cri-o://30948c4a785e10b8642b0cee44ba942c39ee2d253368350e056ad0e5dcec4aec" gracePeriod=30 Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.298189 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="sg-core" containerID="cri-o://ae9b4b3ef47961e294cfc2bb734826fd49a09af2792be53c1bf3eaa0d8e0353c" gracePeriod=30 Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.298090 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-central-agent" 
containerID="cri-o://e0a24d1ab3d0d74ca2ffec4e19ec0f94845ee042ede518011fe486f74e3210b3" gracePeriod=30 Oct 12 20:42:14 crc kubenswrapper[4773]: I1012 20:42:14.332657 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.456796557 podStartE2EDuration="12.332631429s" podCreationTimestamp="2025-10-12 20:42:02 +0000 UTC" firstStartedPulling="2025-10-12 20:42:03.245100376 +0000 UTC m=+1071.481398936" lastFinishedPulling="2025-10-12 20:42:13.120935248 +0000 UTC m=+1081.357233808" observedRunningTime="2025-10-12 20:42:14.322798486 +0000 UTC m=+1082.559097086" watchObservedRunningTime="2025-10-12 20:42:14.332631429 +0000 UTC m=+1082.568930019" Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.307955 4773 generic.go:334] "Generic (PLEG): container finished" podID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerID="4f5eac2db6734e81520d33d370d98e53e1c9c80ee9e39479c083413939d230b5" exitCode=0 Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.308000 4773 generic.go:334] "Generic (PLEG): container finished" podID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerID="ae9b4b3ef47961e294cfc2bb734826fd49a09af2792be53c1bf3eaa0d8e0353c" exitCode=2 Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.308016 4773 generic.go:334] "Generic (PLEG): container finished" podID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerID="e0a24d1ab3d0d74ca2ffec4e19ec0f94845ee042ede518011fe486f74e3210b3" exitCode=0 Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.308044 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerDied","Data":"4f5eac2db6734e81520d33d370d98e53e1c9c80ee9e39479c083413939d230b5"} Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.308082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerDied","Data":"ae9b4b3ef47961e294cfc2bb734826fd49a09af2792be53c1bf3eaa0d8e0353c"} Oct 12 20:42:15 crc kubenswrapper[4773]: I1012 20:42:15.308100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerDied","Data":"e0a24d1ab3d0d74ca2ffec4e19ec0f94845ee042ede518011fe486f74e3210b3"} Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.343784 4773 generic.go:334] "Generic (PLEG): container finished" podID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerID="30948c4a785e10b8642b0cee44ba942c39ee2d253368350e056ad0e5dcec4aec" exitCode=0 Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.344119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerDied","Data":"30948c4a785e10b8642b0cee44ba942c39ee2d253368350e056ad0e5dcec4aec"} Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.537748 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714627 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714690 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714767 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-552p9\" (UniqueName: 
\"kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.714947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd\") pod \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\" (UID: \"553bd0db-fa40-47ad-b1f1-e4d7184dcb60\") " Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.715659 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.716989 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.720604 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9" (OuterVolumeSpecName: "kube-api-access-552p9") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "kube-api-access-552p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.720786 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts" (OuterVolumeSpecName: "scripts") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.748750 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.787345 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.809814 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data" (OuterVolumeSpecName: "config-data") pod "553bd0db-fa40-47ad-b1f1-e4d7184dcb60" (UID: "553bd0db-fa40-47ad-b1f1-e4d7184dcb60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817409 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817438 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-552p9\" (UniqueName: \"kubernetes.io/projected/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-kube-api-access-552p9\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817449 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817459 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817467 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817475 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:16 crc kubenswrapper[4773]: I1012 20:42:16.817483 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/553bd0db-fa40-47ad-b1f1-e4d7184dcb60-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.354874 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"553bd0db-fa40-47ad-b1f1-e4d7184dcb60","Type":"ContainerDied","Data":"26fecc9420a48c627e821e543599ee1e229dcc704b20bb2d2713447358d538b6"} Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.354924 4773 scope.go:117] "RemoveContainer" containerID="4f5eac2db6734e81520d33d370d98e53e1c9c80ee9e39479c083413939d230b5" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.354986 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.372644 4773 scope.go:117] "RemoveContainer" containerID="ae9b4b3ef47961e294cfc2bb734826fd49a09af2792be53c1bf3eaa0d8e0353c" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.399924 4773 scope.go:117] "RemoveContainer" containerID="30948c4a785e10b8642b0cee44ba942c39ee2d253368350e056ad0e5dcec4aec" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.402640 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.409293 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.422870 4773 scope.go:117] "RemoveContainer" containerID="e0a24d1ab3d0d74ca2ffec4e19ec0f94845ee042ede518011fe486f74e3210b3" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.426741 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.427274 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-api" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.427364 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-api" Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.427450 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-central-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.427523 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-central-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.427611 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-notification-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.427694 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-notification-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.427812 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="sg-core" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.427869 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="sg-core" Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.427936 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.427998 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: E1012 20:42:17.428064 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="proxy-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428126 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="proxy-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428536 
4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="proxy-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428602 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-notification-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428655 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="sg-core" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428713 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" containerName="ceilometer-central-agent" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428803 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-httpd" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.428891 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="921796c7-7ab3-4924-bd37-a998ccfab6e3" containerName="neutron-api" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.430757 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.438453 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.438649 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.439650 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.527582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.527638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.527665 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.527911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " 
pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.527976 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.528053 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.528126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55575\" (UniqueName: \"kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629480 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629617 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55575\" (UniqueName: \"kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.629822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.631247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 
crc kubenswrapper[4773]: I1012 20:42:17.633093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.634759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.635155 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.638062 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.642130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.652782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55575\" (UniqueName: \"kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575\") pod \"ceilometer-0\" (UID: 
\"121ee448-a59b-45ec-9e07-69f8d4e22518\") " pod="openstack/ceilometer-0" Oct 12 20:42:17 crc kubenswrapper[4773]: I1012 20:42:17.755994 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:18 crc kubenswrapper[4773]: W1012 20:42:18.209106 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod121ee448_a59b_45ec_9e07_69f8d4e22518.slice/crio-14fbf742d25ec4d19f9b56056ddb47a325d22b9a1b0a66a426c9e218ec17302d WatchSource:0}: Error finding container 14fbf742d25ec4d19f9b56056ddb47a325d22b9a1b0a66a426c9e218ec17302d: Status 404 returned error can't find the container with id 14fbf742d25ec4d19f9b56056ddb47a325d22b9a1b0a66a426c9e218ec17302d Oct 12 20:42:18 crc kubenswrapper[4773]: I1012 20:42:18.209998 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:18 crc kubenswrapper[4773]: I1012 20:42:18.363967 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerStarted","Data":"14fbf742d25ec4d19f9b56056ddb47a325d22b9a1b0a66a426c9e218ec17302d"} Oct 12 20:42:18 crc kubenswrapper[4773]: I1012 20:42:18.490351 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553bd0db-fa40-47ad-b1f1-e4d7184dcb60" path="/var/lib/kubelet/pods/553bd0db-fa40-47ad-b1f1-e4d7184dcb60/volumes" Oct 12 20:42:19 crc kubenswrapper[4773]: I1012 20:42:19.378002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerStarted","Data":"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e"} Oct 12 20:42:20 crc kubenswrapper[4773]: I1012 20:42:20.389343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerStarted","Data":"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e"} Oct 12 20:42:20 crc kubenswrapper[4773]: I1012 20:42:20.389637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerStarted","Data":"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3"} Oct 12 20:42:21 crc kubenswrapper[4773]: I1012 20:42:21.398362 4773 generic.go:334] "Generic (PLEG): container finished" podID="3655b0f6-2e88-4b9e-b836-18633f2f0535" containerID="e0a97b4c2a394f1f9bb07b083f66f1289ba13399d3951eedc7d428b3fa435a3a" exitCode=0 Oct 12 20:42:21 crc kubenswrapper[4773]: I1012 20:42:21.398460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" event={"ID":"3655b0f6-2e88-4b9e-b836-18633f2f0535","Type":"ContainerDied","Data":"e0a97b4c2a394f1f9bb07b083f66f1289ba13399d3951eedc7d428b3fa435a3a"} Oct 12 20:42:21 crc kubenswrapper[4773]: I1012 20:42:21.407709 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerStarted","Data":"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d"} Oct 12 20:42:21 crc kubenswrapper[4773]: I1012 20:42:21.408003 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 20:42:21 crc kubenswrapper[4773]: I1012 20:42:21.438929 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.502219893 podStartE2EDuration="4.438911916s" podCreationTimestamp="2025-10-12 20:42:17 +0000 UTC" firstStartedPulling="2025-10-12 20:42:18.212174798 +0000 UTC m=+1086.448473358" lastFinishedPulling="2025-10-12 20:42:21.148866791 +0000 UTC m=+1089.385165381" observedRunningTime="2025-10-12 20:42:21.431331095 +0000 UTC 
m=+1089.667629655" watchObservedRunningTime="2025-10-12 20:42:21.438911916 +0000 UTC m=+1089.675210476" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.793039 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.844685 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92xgq\" (UniqueName: \"kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq\") pod \"3655b0f6-2e88-4b9e-b836-18633f2f0535\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.844777 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle\") pod \"3655b0f6-2e88-4b9e-b836-18633f2f0535\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.844845 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts\") pod \"3655b0f6-2e88-4b9e-b836-18633f2f0535\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.844914 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data\") pod \"3655b0f6-2e88-4b9e-b836-18633f2f0535\" (UID: \"3655b0f6-2e88-4b9e-b836-18633f2f0535\") " Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.864324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts" (OuterVolumeSpecName: "scripts") pod "3655b0f6-2e88-4b9e-b836-18633f2f0535" (UID: 
"3655b0f6-2e88-4b9e-b836-18633f2f0535"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.865187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq" (OuterVolumeSpecName: "kube-api-access-92xgq") pod "3655b0f6-2e88-4b9e-b836-18633f2f0535" (UID: "3655b0f6-2e88-4b9e-b836-18633f2f0535"). InnerVolumeSpecName "kube-api-access-92xgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.886941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3655b0f6-2e88-4b9e-b836-18633f2f0535" (UID: "3655b0f6-2e88-4b9e-b836-18633f2f0535"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.905878 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data" (OuterVolumeSpecName: "config-data") pod "3655b0f6-2e88-4b9e-b836-18633f2f0535" (UID: "3655b0f6-2e88-4b9e-b836-18633f2f0535"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.946588 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92xgq\" (UniqueName: \"kubernetes.io/projected/3655b0f6-2e88-4b9e-b836-18633f2f0535-kube-api-access-92xgq\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.946625 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.946637 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:22 crc kubenswrapper[4773]: I1012 20:42:22.946649 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3655b0f6-2e88-4b9e-b836-18633f2f0535-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.429000 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" event={"ID":"3655b0f6-2e88-4b9e-b836-18633f2f0535","Type":"ContainerDied","Data":"f77b498bbfd93c64fc76fad3878e6a2654b6a6f82eecf187466561632d15ec0d"} Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.429042 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77b498bbfd93c64fc76fad3878e6a2654b6a6f82eecf187466561632d15ec0d" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.429102 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kkjgn" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.521710 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 20:42:23 crc kubenswrapper[4773]: E1012 20:42:23.522049 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3655b0f6-2e88-4b9e-b836-18633f2f0535" containerName="nova-cell0-conductor-db-sync" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.522064 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3655b0f6-2e88-4b9e-b836-18633f2f0535" containerName="nova-cell0-conductor-db-sync" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.522247 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3655b0f6-2e88-4b9e-b836-18633f2f0535" containerName="nova-cell0-conductor-db-sync" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.524212 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.544611 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.545255 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tcrrv" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.555259 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.561784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95nl6\" (UniqueName: \"kubernetes.io/projected/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-kube-api-access-95nl6\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc 
kubenswrapper[4773]: I1012 20:42:23.561835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.561872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.662782 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95nl6\" (UniqueName: \"kubernetes.io/projected/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-kube-api-access-95nl6\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.663095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.663189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.666858 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.666891 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.686654 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95nl6\" (UniqueName: \"kubernetes.io/projected/6698b6b5-2a0c-45d1-a3dc-ea58147105dc-kube-api-access-95nl6\") pod \"nova-cell0-conductor-0\" (UID: \"6698b6b5-2a0c-45d1-a3dc-ea58147105dc\") " pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:23 crc kubenswrapper[4773]: I1012 20:42:23.845181 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:24 crc kubenswrapper[4773]: I1012 20:42:24.319989 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 20:42:24 crc kubenswrapper[4773]: I1012 20:42:24.436902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6698b6b5-2a0c-45d1-a3dc-ea58147105dc","Type":"ContainerStarted","Data":"bb45473dfd55c2771d201d597fdc0ee8fc9aa540a42dd9e9dc49386e8c02fca2"} Oct 12 20:42:25 crc kubenswrapper[4773]: I1012 20:42:25.450604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6698b6b5-2a0c-45d1-a3dc-ea58147105dc","Type":"ContainerStarted","Data":"631faf5e8f41c44ba74efedba8bb84f272da2b745bf1c85e87d0c3c4c10118c1"} Oct 12 20:42:25 crc kubenswrapper[4773]: I1012 20:42:25.451845 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:28 crc kubenswrapper[4773]: I1012 20:42:28.669202 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:42:28 crc kubenswrapper[4773]: I1012 20:42:28.669620 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:42:33 crc kubenswrapper[4773]: I1012 20:42:33.889381 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 12 20:42:33 crc kubenswrapper[4773]: I1012 20:42:33.917169 
4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.917142758 podStartE2EDuration="10.917142758s" podCreationTimestamp="2025-10-12 20:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:25.470932715 +0000 UTC m=+1093.707231285" watchObservedRunningTime="2025-10-12 20:42:33.917142758 +0000 UTC m=+1102.153441348" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.446731 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h5qcs"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.447755 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.451276 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.451529 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.468810 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5qcs"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.568692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.568777 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.568851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrk62\" (UniqueName: \"kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.568935 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.654169 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.655495 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.660264 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.670256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrk62\" (UniqueName: \"kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.670297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.670351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.670393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.681170 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.681346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.689121 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.690469 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.691050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.699411 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.699691 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.709308 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrk62\" (UniqueName: \"kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62\") pod \"nova-cell0-cell-mapping-h5qcs\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 
20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.764608 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.765733 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.769209 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.769821 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.771459 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.771552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lszg\" (UniqueName: \"kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.771589 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.771623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.807880 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.831869 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874586 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lln8t\" (UniqueName: \"kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lszg\" (UniqueName: \"kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874811 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874828 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.874904 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vhf5q\" (UniqueName: \"kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.875554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.881053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.895446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.901574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lszg\" (UniqueName: \"kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg\") pod \"nova-api-0\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " pod="openstack/nova-api-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.956775 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.961019 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.971104 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 20:42:34 crc kubenswrapper[4773]: I1012 20:42:34.971383 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.030158 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhf5q\" (UniqueName: \"kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040166 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data\") pod \"nova-metadata-0\" (UID: 
\"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040276 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lln8t\" (UniqueName: \"kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgbz\" (UniqueName: \"kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.040473 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 
20:42:35.040519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.045361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.113851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhf5q\" (UniqueName: \"kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.115492 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"] Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.119872 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.120073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.123599 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.131637 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.132740 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lln8t\" (UniqueName: \"kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t\") pod \"nova-scheduler-0\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.138091 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"] Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.141781 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.142262 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " 
pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.142296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.142363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.142953 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgbz\" (UniqueName: \"kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.148204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.161479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.171402 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.173773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgbz\" (UniqueName: \"kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz\") pod \"nova-metadata-0\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.244937 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.245276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.245330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.245348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bc4g\" (UniqueName: \"kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: 
\"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.245428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.346676 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.346768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.346789 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.346829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " 
pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.346847 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bc4g\" (UniqueName: \"kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.347802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.347861 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.348379 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.349659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 
20:42:35.363311 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bc4g\" (UniqueName: \"kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g\") pod \"dnsmasq-dns-75fb48c489-gzvml\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") " pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.385975 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.394305 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.455939 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5qcs"] Oct 12 20:42:35 crc kubenswrapper[4773]: W1012 20:42:35.466034 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58145541_18b5_45d4_a92e_646ac7ef7961.slice/crio-994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba WatchSource:0}: Error finding container 994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba: Status 404 returned error can't find the container with id 994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.472093 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.611522 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5qcs" event={"ID":"58145541-18b5-45d4-a92e-646ac7ef7961","Type":"ContainerStarted","Data":"994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba"} Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.615744 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:35 crc kubenswrapper[4773]: W1012 20:42:35.626857 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14dfeb1c_832d_419c_a0ad_9bd775e4064a.slice/crio-6a37ca843831ace50a5587da5dce79f5cf7319958aa91dc5e3ce94084c879836 WatchSource:0}: Error finding container 6a37ca843831ace50a5587da5dce79f5cf7319958aa91dc5e3ce94084c879836: Status 404 returned error can't find the container with id 6a37ca843831ace50a5587da5dce79f5cf7319958aa91dc5e3ce94084c879836 Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.745548 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:42:35 crc kubenswrapper[4773]: W1012 20:42:35.767610 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49792abe_3a04_4816_83ae_cdc737f4527d.slice/crio-374b6e3fc9e27dab47d2a336692a5efbc373deced01443d6e173fd9ba9d0c980 WatchSource:0}: Error finding container 374b6e3fc9e27dab47d2a336692a5efbc373deced01443d6e173fd9ba9d0c980: Status 404 returned error can't find the container with id 374b6e3fc9e27dab47d2a336692a5efbc373deced01443d6e173fd9ba9d0c980 Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.828482 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wghfz"] Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 
20:42:35.830883 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.833246 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.833421 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.854180 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wghfz"] Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.872658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn65l\" (UniqueName: \"kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.872893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.872911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.872946 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.977932 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.978283 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.978330 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.978387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn65l\" (UniqueName: \"kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:35 crc kubenswrapper[4773]: I1012 20:42:35.992531 
4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.005233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.005667 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn65l\" (UniqueName: \"kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.023391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data\") pod \"nova-cell1-conductor-db-sync-wghfz\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.127574 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.170187 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.180609 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.237609 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"] Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.625672 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerStarted","Data":"9cfcbfa9e040b928a9b2e220b6178b68376196a58fcb7e8b9e739b9b21877933"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.635836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"49792abe-3a04-4816-83ae-cdc737f4527d","Type":"ContainerStarted","Data":"374b6e3fc9e27dab47d2a336692a5efbc373deced01443d6e173fd9ba9d0c980"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.644767 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5qcs" event={"ID":"58145541-18b5-45d4-a92e-646ac7ef7961","Type":"ContainerStarted","Data":"79e2214c3d6be752386f2526678055c93984630918e896f4fc108121fe57e511"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.649085 4773 generic.go:334] "Generic (PLEG): container finished" podID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerID="122ee3050c4c2ac28f95a1769be0784635b524fb27bb3542123edb6b67ec2757" exitCode=0 Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.649553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" event={"ID":"f9e044cb-617e-4470-bdd0-d7a28f2d2a63","Type":"ContainerDied","Data":"122ee3050c4c2ac28f95a1769be0784635b524fb27bb3542123edb6b67ec2757"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.649575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" 
event={"ID":"f9e044cb-617e-4470-bdd0-d7a28f2d2a63","Type":"ContainerStarted","Data":"e6da95be093cfec1ebb2893ebffbd39eb9379ed806832dd73089216a2964bb13"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.661950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerStarted","Data":"6a37ca843831ace50a5587da5dce79f5cf7319958aa91dc5e3ce94084c879836"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.662487 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wghfz"] Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.663774 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"93fac107-ac20-4b2d-8960-e1f6b742198a","Type":"ContainerStarted","Data":"2e96909fc69979c8bff8a6a86070dc361409a2120f5ebc44811b8e7c03fb6b24"} Oct 12 20:42:36 crc kubenswrapper[4773]: I1012 20:42:36.689997 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h5qcs" podStartSLOduration=2.689980286 podStartE2EDuration="2.689980286s" podCreationTimestamp="2025-10-12 20:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:36.6646002 +0000 UTC m=+1104.900898760" watchObservedRunningTime="2025-10-12 20:42:36.689980286 +0000 UTC m=+1104.926278846" Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.683019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wghfz" event={"ID":"2367dd99-7d9e-412e-a065-a10039761c38","Type":"ContainerStarted","Data":"a03922c4fbafa82c4fc6237c18b8c45736b72024be3ad3312fa40f232f9c2319"} Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.683470 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wghfz" 
event={"ID":"2367dd99-7d9e-412e-a065-a10039761c38","Type":"ContainerStarted","Data":"20c9291ea685b41a2b1b4b4cdb65338aabc29f0d5e05a6069a74800045c9cef7"} Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.685772 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" event={"ID":"f9e044cb-617e-4470-bdd0-d7a28f2d2a63","Type":"ContainerStarted","Data":"ed73eaf36fbee7b2df707c4dbee48e0f25dcd2138030a9fb86d027b6005457d4"} Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.685919 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.723237 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" podStartSLOduration=2.7232215440000003 podStartE2EDuration="2.723221544s" podCreationTimestamp="2025-10-12 20:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:37.721006012 +0000 UTC m=+1105.957304572" watchObservedRunningTime="2025-10-12 20:42:37.723221544 +0000 UTC m=+1105.959520104" Oct 12 20:42:37 crc kubenswrapper[4773]: I1012 20:42:37.723547 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wghfz" podStartSLOduration=2.723541582 podStartE2EDuration="2.723541582s" podCreationTimestamp="2025-10-12 20:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:37.699333349 +0000 UTC m=+1105.935631919" watchObservedRunningTime="2025-10-12 20:42:37.723541582 +0000 UTC m=+1105.959840142" Oct 12 20:42:38 crc kubenswrapper[4773]: I1012 20:42:38.240868 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:38 crc kubenswrapper[4773]: I1012 
20:42:38.253774 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.708990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"49792abe-3a04-4816-83ae-cdc737f4527d","Type":"ContainerStarted","Data":"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.709371 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="49792abe-3a04-4816-83ae-cdc737f4527d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d" gracePeriod=30 Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.734764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerStarted","Data":"323170f182993454fd6e4cbfdf9e43a2604cea7956675e442cdc54b0d8fd526c"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.734818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerStarted","Data":"21723b4d6aece590887274664d79699da7bdde64d71a11fafef0646896b1ba85"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.745485 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.789451161 podStartE2EDuration="6.745465386s" podCreationTimestamp="2025-10-12 20:42:34 +0000 UTC" firstStartedPulling="2025-10-12 20:42:35.77268532 +0000 UTC m=+1104.008983880" lastFinishedPulling="2025-10-12 20:42:39.728699545 +0000 UTC m=+1107.964998105" observedRunningTime="2025-10-12 20:42:40.732700781 +0000 UTC m=+1108.968999361" watchObservedRunningTime="2025-10-12 20:42:40.745465386 +0000 UTC 
m=+1108.981763946" Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.762267 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"93fac107-ac20-4b2d-8960-e1f6b742198a","Type":"ContainerStarted","Data":"5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.771341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerStarted","Data":"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.771390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerStarted","Data":"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46"} Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.771502 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-log" containerID="cri-o://cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" gracePeriod=30 Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.771769 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-metadata" containerID="cri-o://8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" gracePeriod=30 Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.779128 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.682163448 podStartE2EDuration="6.779107192s" podCreationTimestamp="2025-10-12 20:42:34 +0000 UTC" firstStartedPulling="2025-10-12 20:42:35.637159452 +0000 UTC m=+1103.873458002" 
lastFinishedPulling="2025-10-12 20:42:39.734103186 +0000 UTC m=+1107.970401746" observedRunningTime="2025-10-12 20:42:40.771466679 +0000 UTC m=+1109.007765259" watchObservedRunningTime="2025-10-12 20:42:40.779107192 +0000 UTC m=+1109.015405752" Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.801572 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.183888109 podStartE2EDuration="6.801555026s" podCreationTimestamp="2025-10-12 20:42:34 +0000 UTC" firstStartedPulling="2025-10-12 20:42:36.113481366 +0000 UTC m=+1104.349779916" lastFinishedPulling="2025-10-12 20:42:39.731148273 +0000 UTC m=+1107.967446833" observedRunningTime="2025-10-12 20:42:40.791824125 +0000 UTC m=+1109.028122695" watchObservedRunningTime="2025-10-12 20:42:40.801555026 +0000 UTC m=+1109.037853586" Oct 12 20:42:40 crc kubenswrapper[4773]: I1012 20:42:40.839321 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.293886976 podStartE2EDuration="6.839290464s" podCreationTimestamp="2025-10-12 20:42:34 +0000 UTC" firstStartedPulling="2025-10-12 20:42:36.184870591 +0000 UTC m=+1104.421169151" lastFinishedPulling="2025-10-12 20:42:39.730274079 +0000 UTC m=+1107.966572639" observedRunningTime="2025-10-12 20:42:40.835750186 +0000 UTC m=+1109.072048756" watchObservedRunningTime="2025-10-12 20:42:40.839290464 +0000 UTC m=+1109.075589024" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.275160 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.461146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle\") pod \"973b46bc-543c-4bad-b8b7-88530d24fa0d\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.461238 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs\") pod \"973b46bc-543c-4bad-b8b7-88530d24fa0d\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.461265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data\") pod \"973b46bc-543c-4bad-b8b7-88530d24fa0d\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.461387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgbz\" (UniqueName: \"kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz\") pod \"973b46bc-543c-4bad-b8b7-88530d24fa0d\" (UID: \"973b46bc-543c-4bad-b8b7-88530d24fa0d\") " Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.462247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs" (OuterVolumeSpecName: "logs") pod "973b46bc-543c-4bad-b8b7-88530d24fa0d" (UID: "973b46bc-543c-4bad-b8b7-88530d24fa0d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.467745 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz" (OuterVolumeSpecName: "kube-api-access-xtgbz") pod "973b46bc-543c-4bad-b8b7-88530d24fa0d" (UID: "973b46bc-543c-4bad-b8b7-88530d24fa0d"). InnerVolumeSpecName "kube-api-access-xtgbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.488774 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data" (OuterVolumeSpecName: "config-data") pod "973b46bc-543c-4bad-b8b7-88530d24fa0d" (UID: "973b46bc-543c-4bad-b8b7-88530d24fa0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.492372 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973b46bc-543c-4bad-b8b7-88530d24fa0d" (UID: "973b46bc-543c-4bad-b8b7-88530d24fa0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.566648 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.566695 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973b46bc-543c-4bad-b8b7-88530d24fa0d-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.566735 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973b46bc-543c-4bad-b8b7-88530d24fa0d-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.566754 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgbz\" (UniqueName: \"kubernetes.io/projected/973b46bc-543c-4bad-b8b7-88530d24fa0d-kube-api-access-xtgbz\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.782219 4773 generic.go:334] "Generic (PLEG): container finished" podID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerID="8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" exitCode=0 Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.782265 4773 generic.go:334] "Generic (PLEG): container finished" podID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerID="cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" exitCode=143 Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.783589 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.783819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerDied","Data":"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180"} Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.783864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerDied","Data":"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46"} Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.783884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"973b46bc-543c-4bad-b8b7-88530d24fa0d","Type":"ContainerDied","Data":"9cfcbfa9e040b928a9b2e220b6178b68376196a58fcb7e8b9e739b9b21877933"} Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.783909 4773 scope.go:117] "RemoveContainer" containerID="8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.833191 4773 scope.go:117] "RemoveContainer" containerID="cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.846851 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.870031 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.873210 4773 scope.go:117] "RemoveContainer" containerID="8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" Oct 12 20:42:41 crc kubenswrapper[4773]: E1012 20:42:41.875035 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180\": container with ID starting with 8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180 not found: ID does not exist" containerID="8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875072 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180"} err="failed to get container status \"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180\": rpc error: code = NotFound desc = could not find container \"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180\": container with ID starting with 8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180 not found: ID does not exist" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875098 4773 scope.go:117] "RemoveContainer" containerID="cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" Oct 12 20:42:41 crc kubenswrapper[4773]: E1012 20:42:41.875329 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46\": container with ID starting with cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46 not found: ID does not exist" containerID="cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875345 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46"} err="failed to get container status \"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46\": rpc error: code = NotFound desc = could not find container \"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46\": container with ID 
starting with cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46 not found: ID does not exist" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875368 4773 scope.go:117] "RemoveContainer" containerID="8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875863 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180"} err="failed to get container status \"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180\": rpc error: code = NotFound desc = could not find container \"8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180\": container with ID starting with 8519c8eed0744503e334f6eb543653f67158f0d6a3da0ef744d8ddaea8e3f180 not found: ID does not exist" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.875879 4773 scope.go:117] "RemoveContainer" containerID="cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.876164 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46"} err="failed to get container status \"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46\": rpc error: code = NotFound desc = could not find container \"cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46\": container with ID starting with cdc7f2024b6eea2242f756447db221d8995d9927d83bb32ccc4fe3a4a512bb46 not found: ID does not exist" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.884009 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:41 crc kubenswrapper[4773]: E1012 20:42:41.884472 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" 
containerName="nova-metadata-metadata" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.884495 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-metadata" Oct 12 20:42:41 crc kubenswrapper[4773]: E1012 20:42:41.884525 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-log" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.884533 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-log" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.884688 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-metadata" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.884737 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" containerName="nova-metadata-log" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.885579 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.895315 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.895503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 20:42:41 crc kubenswrapper[4773]: I1012 20:42:41.909052 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.076100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.076151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvlt\" (UniqueName: \"kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.076178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.076202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.076221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.177664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.177822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvlt\" (UniqueName: \"kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.177854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.177878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.177898 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.179388 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.182497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.182650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.182800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.198136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvlt\" (UniqueName: \"kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt\") pod \"nova-metadata-0\" 
(UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.217200 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.498426 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973b46bc-543c-4bad-b8b7-88530d24fa0d" path="/var/lib/kubelet/pods/973b46bc-543c-4bad-b8b7-88530d24fa0d/volumes" Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.719827 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:42 crc kubenswrapper[4773]: I1012 20:42:42.825007 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerStarted","Data":"697b673108e0eab83b23aa101d272e91cee7259e764643aa46ca5e141b63a83e"} Oct 12 20:42:43 crc kubenswrapper[4773]: I1012 20:42:43.834944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerStarted","Data":"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f"} Oct 12 20:42:43 crc kubenswrapper[4773]: I1012 20:42:43.835521 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerStarted","Data":"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006"} Oct 12 20:42:43 crc kubenswrapper[4773]: I1012 20:42:43.862753 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.86273228 podStartE2EDuration="2.86273228s" podCreationTimestamp="2025-10-12 20:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 
20:42:43.856045064 +0000 UTC m=+1112.092343624" watchObservedRunningTime="2025-10-12 20:42:43.86273228 +0000 UTC m=+1112.099030830" Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.852067 4773 generic.go:334] "Generic (PLEG): container finished" podID="58145541-18b5-45d4-a92e-646ac7ef7961" containerID="79e2214c3d6be752386f2526678055c93984630918e896f4fc108121fe57e511" exitCode=0 Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.852123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5qcs" event={"ID":"58145541-18b5-45d4-a92e-646ac7ef7961","Type":"ContainerDied","Data":"79e2214c3d6be752386f2526678055c93984630918e896f4fc108121fe57e511"} Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.857049 4773 generic.go:334] "Generic (PLEG): container finished" podID="2367dd99-7d9e-412e-a065-a10039761c38" containerID="a03922c4fbafa82c4fc6237c18b8c45736b72024be3ad3312fa40f232f9c2319" exitCode=0 Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.858543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wghfz" event={"ID":"2367dd99-7d9e-412e-a065-a10039761c38","Type":"ContainerDied","Data":"a03922c4fbafa82c4fc6237c18b8c45736b72024be3ad3312fa40f232f9c2319"} Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.972251 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:42:44 crc kubenswrapper[4773]: I1012 20:42:44.972701 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.172229 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.387287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 
20:42:45.387590 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.421603 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.474692 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.550553 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.550836 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="dnsmasq-dns" containerID="cri-o://1ba7a991f19b7a241f058973232f6ed5c46512aa8f75d96b65970a080497f406" gracePeriod=10 Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.871372 4773 generic.go:334] "Generic (PLEG): container finished" podID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerID="1ba7a991f19b7a241f058973232f6ed5c46512aa8f75d96b65970a080497f406" exitCode=0 Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.871562 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" event={"ID":"f973dfa8-7c8a-47bb-9685-4b4e36d479e0","Type":"ContainerDied","Data":"1ba7a991f19b7a241f058973232f6ed5c46512aa8f75d96b65970a080497f406"} Oct 12 20:42:45 crc kubenswrapper[4773]: I1012 20:42:45.934632 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.057003 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.057235 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.086455 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.255368 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb\") pod \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.255481 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc\") pod \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.255706 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb\") pod \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.255959 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrmtf\" (UniqueName: \"kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf\") pod 
\"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.256066 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config\") pod \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\" (UID: \"f973dfa8-7c8a-47bb-9685-4b4e36d479e0\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.292291 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf" (OuterVolumeSpecName: "kube-api-access-hrmtf") pod "f973dfa8-7c8a-47bb-9685-4b4e36d479e0" (UID: "f973dfa8-7c8a-47bb-9685-4b4e36d479e0"). InnerVolumeSpecName "kube-api-access-hrmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.358051 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrmtf\" (UniqueName: \"kubernetes.io/projected/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-kube-api-access-hrmtf\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.408324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f973dfa8-7c8a-47bb-9685-4b4e36d479e0" (UID: "f973dfa8-7c8a-47bb-9685-4b4e36d479e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.414027 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f973dfa8-7c8a-47bb-9685-4b4e36d479e0" (UID: "f973dfa8-7c8a-47bb-9685-4b4e36d479e0"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.414747 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.415331 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.443607 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config" (OuterVolumeSpecName: "config") pod "f973dfa8-7c8a-47bb-9685-4b4e36d479e0" (UID: "f973dfa8-7c8a-47bb-9685-4b4e36d479e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.464269 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.464304 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.464316 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.467206 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f973dfa8-7c8a-47bb-9685-4b4e36d479e0" (UID: 
"f973dfa8-7c8a-47bb-9685-4b4e36d479e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.565649 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts\") pod \"2367dd99-7d9e-412e-a065-a10039761c38\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.565955 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts\") pod \"58145541-18b5-45d4-a92e-646ac7ef7961\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566019 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn65l\" (UniqueName: \"kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l\") pod \"2367dd99-7d9e-412e-a065-a10039761c38\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566081 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle\") pod \"58145541-18b5-45d4-a92e-646ac7ef7961\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566111 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data\") pod \"2367dd99-7d9e-412e-a065-a10039761c38\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566181 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nrk62\" (UniqueName: \"kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62\") pod \"58145541-18b5-45d4-a92e-646ac7ef7961\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566256 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle\") pod \"2367dd99-7d9e-412e-a065-a10039761c38\" (UID: \"2367dd99-7d9e-412e-a065-a10039761c38\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data\") pod \"58145541-18b5-45d4-a92e-646ac7ef7961\" (UID: \"58145541-18b5-45d4-a92e-646ac7ef7961\") " Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.566677 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f973dfa8-7c8a-47bb-9685-4b4e36d479e0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.568627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts" (OuterVolumeSpecName: "scripts") pod "2367dd99-7d9e-412e-a065-a10039761c38" (UID: "2367dd99-7d9e-412e-a065-a10039761c38"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.570623 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l" (OuterVolumeSpecName: "kube-api-access-dn65l") pod "2367dd99-7d9e-412e-a065-a10039761c38" (UID: "2367dd99-7d9e-412e-a065-a10039761c38"). InnerVolumeSpecName "kube-api-access-dn65l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.571072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62" (OuterVolumeSpecName: "kube-api-access-nrk62") pod "58145541-18b5-45d4-a92e-646ac7ef7961" (UID: "58145541-18b5-45d4-a92e-646ac7ef7961"). InnerVolumeSpecName "kube-api-access-nrk62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.571410 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts" (OuterVolumeSpecName: "scripts") pod "58145541-18b5-45d4-a92e-646ac7ef7961" (UID: "58145541-18b5-45d4-a92e-646ac7ef7961"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.593000 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data" (OuterVolumeSpecName: "config-data") pod "58145541-18b5-45d4-a92e-646ac7ef7961" (UID: "58145541-18b5-45d4-a92e-646ac7ef7961"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.594406 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2367dd99-7d9e-412e-a065-a10039761c38" (UID: "2367dd99-7d9e-412e-a065-a10039761c38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.602920 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data" (OuterVolumeSpecName: "config-data") pod "2367dd99-7d9e-412e-a065-a10039761c38" (UID: "2367dd99-7d9e-412e-a065-a10039761c38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.603240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58145541-18b5-45d4-a92e-646ac7ef7961" (UID: "58145541-18b5-45d4-a92e-646ac7ef7961"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.668891 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669126 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrk62\" (UniqueName: \"kubernetes.io/projected/58145541-18b5-45d4-a92e-646ac7ef7961-kube-api-access-nrk62\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669220 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669282 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669373 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2367dd99-7d9e-412e-a065-a10039761c38-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669435 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669487 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn65l\" (UniqueName: \"kubernetes.io/projected/2367dd99-7d9e-412e-a065-a10039761c38-kube-api-access-dn65l\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.669546 4773 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58145541-18b5-45d4-a92e-646ac7ef7961-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.883121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wghfz" event={"ID":"2367dd99-7d9e-412e-a065-a10039761c38","Type":"ContainerDied","Data":"20c9291ea685b41a2b1b4b4cdb65338aabc29f0d5e05a6069a74800045c9cef7"} Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.883340 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c9291ea685b41a2b1b4b4cdb65338aabc29f0d5e05a6069a74800045c9cef7" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.883139 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wghfz" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.885673 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.885677 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc" event={"ID":"f973dfa8-7c8a-47bb-9685-4b4e36d479e0","Type":"ContainerDied","Data":"ce87a4b53c65cf58dab8487789f8bcf1e355b3b5b278492b8f6746c2735fdb19"} Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.885981 4773 scope.go:117] "RemoveContainer" containerID="1ba7a991f19b7a241f058973232f6ed5c46512aa8f75d96b65970a080497f406" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.902354 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5qcs" event={"ID":"58145541-18b5-45d4-a92e-646ac7ef7961","Type":"ContainerDied","Data":"994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba"} Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.902519 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994ed84de521c87cd18bdb8a68bc6607eb2ce3b0cf47b9940974ca2b268911ba" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.903356 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5qcs" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.917488 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.931018 4773 scope.go:117] "RemoveContainer" containerID="b1bd1c14c551b4a3c5307519bc9506bc5c9bfadffa7f708e0d4dd031ba71d7e2" Oct 12 20:42:46 crc kubenswrapper[4773]: I1012 20:42:46.938679 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdc9d6cdc-gsdxc"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.005242 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: E1012 20:42:47.012156 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="dnsmasq-dns" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.012364 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="dnsmasq-dns" Oct 12 20:42:47 crc kubenswrapper[4773]: E1012 20:42:47.012444 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="init" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.012495 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="init" Oct 12 20:42:47 crc kubenswrapper[4773]: E1012 20:42:47.012547 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58145541-18b5-45d4-a92e-646ac7ef7961" containerName="nova-manage" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.012595 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="58145541-18b5-45d4-a92e-646ac7ef7961" containerName="nova-manage" Oct 12 20:42:47 crc kubenswrapper[4773]: E1012 20:42:47.014021 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2367dd99-7d9e-412e-a065-a10039761c38" containerName="nova-cell1-conductor-db-sync" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.014091 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2367dd99-7d9e-412e-a065-a10039761c38" containerName="nova-cell1-conductor-db-sync" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.014321 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="58145541-18b5-45d4-a92e-646ac7ef7961" containerName="nova-manage" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.014410 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2367dd99-7d9e-412e-a065-a10039761c38" containerName="nova-cell1-conductor-db-sync" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.014466 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" containerName="dnsmasq-dns" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.015126 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.021580 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.022229 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.076873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.077200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.077314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8t5b\" (UniqueName: \"kubernetes.io/projected/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-kube-api-access-v8t5b\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.159742 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.165710 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.165935 4773 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-api-0" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-log" containerID="cri-o://21723b4d6aece590887274664d79699da7bdde64d71a11fafef0646896b1ba85" gracePeriod=30 Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.166008 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-api" containerID="cri-o://323170f182993454fd6e4cbfdf9e43a2604cea7956675e442cdc54b0d8fd526c" gracePeriod=30 Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.178595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.178773 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8t5b\" (UniqueName: \"kubernetes.io/projected/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-kube-api-access-v8t5b\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.179145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.183706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-config-data\") pod \"nova-cell1-conductor-0\" 
(UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.191487 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.199487 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8t5b\" (UniqueName: \"kubernetes.io/projected/5838c3b9-38bc-4a97-bcbc-3a734b6b230f-kube-api-access-v8t5b\") pod \"nova-cell1-conductor-0\" (UID: \"5838c3b9-38bc-4a97-bcbc-3a734b6b230f\") " pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.218862 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.219070 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.247145 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.331571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.762153 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.777656 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.921454 4773 generic.go:334] "Generic (PLEG): container finished" podID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerID="21723b4d6aece590887274664d79699da7bdde64d71a11fafef0646896b1ba85" exitCode=143 Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.921512 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerDied","Data":"21723b4d6aece590887274664d79699da7bdde64d71a11fafef0646896b1ba85"} Oct 12 20:42:47 crc kubenswrapper[4773]: I1012 20:42:47.926368 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5838c3b9-38bc-4a97-bcbc-3a734b6b230f","Type":"ContainerStarted","Data":"5ff1a930dcf7a55300bc7795d4d2a61007d815942f01a5e469d46123e85088fb"} Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.490981 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f973dfa8-7c8a-47bb-9685-4b4e36d479e0" path="/var/lib/kubelet/pods/f973dfa8-7c8a-47bb-9685-4b4e36d479e0/volumes" Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.943912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5838c3b9-38bc-4a97-bcbc-3a734b6b230f","Type":"ContainerStarted","Data":"37c3522e6d4d76309609aeb26c38494f4618d44e85b78d32150e4c9d5fbb6843"} Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.944409 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerName="nova-scheduler-scheduler" containerID="cri-o://5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" gracePeriod=30 Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.944529 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-log" containerID="cri-o://c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" gracePeriod=30 Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.944627 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.944693 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-metadata" containerID="cri-o://ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" gracePeriod=30 Oct 12 20:42:48 crc kubenswrapper[4773]: I1012 20:42:48.972677 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.972659238 podStartE2EDuration="2.972659238s" podCreationTimestamp="2025-10-12 20:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:48.968143363 +0000 UTC m=+1117.204441933" watchObservedRunningTime="2025-10-12 20:42:48.972659238 +0000 UTC m=+1117.208957788" Oct 12 20:42:49 crc kubenswrapper[4773]: E1012 20:42:49.143022 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fca554_4ea5_44a6_870a_425f36b1c6f7.slice/crio-c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fca554_4ea5_44a6_870a_425f36b1c6f7.slice/crio-conmon-c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006.scope\": RecentStats: unable to find data in memory cache]" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.612089 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.727295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs\") pod \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.727389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlvlt\" (UniqueName: \"kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt\") pod \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.727497 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data\") pod \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.727582 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle\") pod \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.727626 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs\") pod \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\" (UID: \"d4fca554-4ea5-44a6-870a-425f36b1c6f7\") " Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.728170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs" (OuterVolumeSpecName: "logs") pod "d4fca554-4ea5-44a6-870a-425f36b1c6f7" (UID: "d4fca554-4ea5-44a6-870a-425f36b1c6f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.735261 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt" (OuterVolumeSpecName: "kube-api-access-qlvlt") pod "d4fca554-4ea5-44a6-870a-425f36b1c6f7" (UID: "d4fca554-4ea5-44a6-870a-425f36b1c6f7"). InnerVolumeSpecName "kube-api-access-qlvlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.756804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4fca554-4ea5-44a6-870a-425f36b1c6f7" (UID: "d4fca554-4ea5-44a6-870a-425f36b1c6f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.756872 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data" (OuterVolumeSpecName: "config-data") pod "d4fca554-4ea5-44a6-870a-425f36b1c6f7" (UID: "d4fca554-4ea5-44a6-870a-425f36b1c6f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.796616 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d4fca554-4ea5-44a6-870a-425f36b1c6f7" (UID: "d4fca554-4ea5-44a6-870a-425f36b1c6f7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.829963 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlvlt\" (UniqueName: \"kubernetes.io/projected/d4fca554-4ea5-44a6-870a-425f36b1c6f7-kube-api-access-qlvlt\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.829995 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.830004 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.830014 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4fca554-4ea5-44a6-870a-425f36b1c6f7-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.830023 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4fca554-4ea5-44a6-870a-425f36b1c6f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954162 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerID="ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" exitCode=0 Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954197 4773 generic.go:334] "Generic (PLEG): container finished" podID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerID="c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" exitCode=143 Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954221 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerDied","Data":"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f"} Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954305 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerDied","Data":"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006"} Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954315 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4fca554-4ea5-44a6-870a-425f36b1c6f7","Type":"ContainerDied","Data":"697b673108e0eab83b23aa101d272e91cee7259e764643aa46ca5e141b63a83e"} Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.954332 4773 scope.go:117] "RemoveContainer" containerID="ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.988956 4773 scope.go:117] "RemoveContainer" containerID="c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" Oct 12 20:42:49 crc kubenswrapper[4773]: I1012 20:42:49.990942 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:49 crc 
kubenswrapper[4773]: I1012 20:42:49.999835 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.023343 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.023769 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-metadata" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.023785 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-metadata" Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.023806 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-log" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.023812 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-log" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.023985 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-metadata" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.024002 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" containerName="nova-metadata-log" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.024954 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.030554 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.032794 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.033526 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.037988 4773 scope.go:117] "RemoveContainer" containerID="ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.038496 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f\": container with ID starting with ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f not found: ID does not exist" containerID="ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.038559 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f"} err="failed to get container status \"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f\": rpc error: code = NotFound desc = could not find container \"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f\": container with ID starting with ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f not found: ID does not exist" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.038584 4773 scope.go:117] "RemoveContainer" containerID="c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" Oct 12 20:42:50 crc 
kubenswrapper[4773]: E1012 20:42:50.038878 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006\": container with ID starting with c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006 not found: ID does not exist" containerID="c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.038931 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006"} err="failed to get container status \"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006\": rpc error: code = NotFound desc = could not find container \"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006\": container with ID starting with c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006 not found: ID does not exist" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.038947 4773 scope.go:117] "RemoveContainer" containerID="ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.039313 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f"} err="failed to get container status \"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f\": rpc error: code = NotFound desc = could not find container \"ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f\": container with ID starting with ff36656309e813e655a087882fb82c92b42db76bd1233b3bd368d159a65f096f not found: ID does not exist" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.039348 4773 scope.go:117] "RemoveContainer" containerID="c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006" Oct 12 
20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.041737 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006"} err="failed to get container status \"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006\": rpc error: code = NotFound desc = could not find container \"c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006\": container with ID starting with c24d8486073e6f657199506cd5918994b0a528a869bf177782f3c860038e1006 not found: ID does not exist" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.137605 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.137688 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.137890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j884p\" (UniqueName: \"kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.138082 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs\") 
pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.138126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.239514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.239575 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.239632 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.239678 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 
20:42:50.239744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j884p\" (UniqueName: \"kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.240029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.248210 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.257967 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.265425 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.293259 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j884p\" (UniqueName: 
\"kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p\") pod \"nova-metadata-0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.351209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.397844 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.413683 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.417591 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 12 20:42:50 crc kubenswrapper[4773]: E1012 20:42:50.417644 4773 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerName="nova-scheduler-scheduler" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.496361 4773 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4fca554-4ea5-44a6-870a-425f36b1c6f7" path="/var/lib/kubelet/pods/d4fca554-4ea5-44a6-870a-425f36b1c6f7/volumes" Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.819096 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.819574 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" containerName="kube-state-metrics" containerID="cri-o://91b6508dc536379c0a60353a79495e377d6674d08f921506b2aaa22f3d67246c" gracePeriod=30 Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.953010 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.975917 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" containerID="91b6508dc536379c0a60353a79495e377d6674d08f921506b2aaa22f3d67246c" exitCode=2 Oct 12 20:42:50 crc kubenswrapper[4773]: I1012 20:42:50.975958 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0","Type":"ContainerDied","Data":"91b6508dc536379c0a60353a79495e377d6674d08f921506b2aaa22f3d67246c"} Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.396108 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.464654 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hd9h\" (UniqueName: \"kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h\") pod \"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0\" (UID: \"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0\") " Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.480036 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h" (OuterVolumeSpecName: "kube-api-access-7hd9h") pod "3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" (UID: "3d9d2321-06c1-42e1-b4a9-d6a759d0eba0"). InnerVolumeSpecName "kube-api-access-7hd9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.568950 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hd9h\" (UniqueName: \"kubernetes.io/projected/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0-kube-api-access-7hd9h\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.989998 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.990079 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3d9d2321-06c1-42e1-b4a9-d6a759d0eba0","Type":"ContainerDied","Data":"5a759c1a95e42252828f3e19421e3b56cc70ebf51f63f3740c67b1bddfc4d83e"} Oct 12 20:42:51 crc kubenswrapper[4773]: I1012 20:42:51.990130 4773 scope.go:117] "RemoveContainer" containerID="91b6508dc536379c0a60353a79495e377d6674d08f921506b2aaa22f3d67246c" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:51.999068 4773 generic.go:334] "Generic (PLEG): container finished" podID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerID="323170f182993454fd6e4cbfdf9e43a2604cea7956675e442cdc54b0d8fd526c" exitCode=0 Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:51.999137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerDied","Data":"323170f182993454fd6e4cbfdf9e43a2604cea7956675e442cdc54b0d8fd526c"} Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.001499 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.004820 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerStarted","Data":"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca"} Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.004854 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerStarted","Data":"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060"} Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.004876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerStarted","Data":"54d5c46617d43798492c30bc9fc62a258eee5533e4781d5bfe52a0e8b2bb9a1f"} Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.013811 4773 generic.go:334] "Generic (PLEG): container finished" podID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerID="5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" exitCode=0 Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.013992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"93fac107-ac20-4b2d-8960-e1f6b742198a","Type":"ContainerDied","Data":"5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3"} Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.075724 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.075694726 podStartE2EDuration="3.075694726s" podCreationTimestamp="2025-10-12 20:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:52.04453353 +0000 UTC m=+1120.280832090" 
watchObservedRunningTime="2025-10-12 20:42:52.075694726 +0000 UTC m=+1120.311993286" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.080961 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data\") pod \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.081128 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lszg\" (UniqueName: \"kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg\") pod \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.081152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle\") pod \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.081196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs\") pod \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\" (UID: \"14dfeb1c-832d-419c-a0ad-9bd775e4064a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.111064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs" (OuterVolumeSpecName: "logs") pod "14dfeb1c-832d-419c-a0ad-9bd775e4064a" (UID: "14dfeb1c-832d-419c-a0ad-9bd775e4064a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.117902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg" (OuterVolumeSpecName: "kube-api-access-8lszg") pod "14dfeb1c-832d-419c-a0ad-9bd775e4064a" (UID: "14dfeb1c-832d-419c-a0ad-9bd775e4064a"). InnerVolumeSpecName "kube-api-access-8lszg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.124768 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.136407 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.143912 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:52 crc kubenswrapper[4773]: E1012 20:42:52.144327 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-log" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.144346 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-log" Oct 12 20:42:52 crc kubenswrapper[4773]: E1012 20:42:52.144359 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-api" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.144367 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-api" Oct 12 20:42:52 crc kubenswrapper[4773]: E1012 20:42:52.144381 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" containerName="kube-state-metrics" Oct 12 20:42:52 crc 
kubenswrapper[4773]: I1012 20:42:52.144389 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" containerName="kube-state-metrics" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.144566 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" containerName="kube-state-metrics" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.144582 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-log" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.144590 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" containerName="nova-api-api" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.145235 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.149124 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.149634 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.167924 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.192033 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lszg\" (UniqueName: \"kubernetes.io/projected/14dfeb1c-832d-419c-a0ad-9bd775e4064a-kube-api-access-8lszg\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.192383 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14dfeb1c-832d-419c-a0ad-9bd775e4064a-logs\") on node \"crc\" DevicePath \"\"" Oct 12 
20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.202797 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data" (OuterVolumeSpecName: "config-data") pod "14dfeb1c-832d-419c-a0ad-9bd775e4064a" (UID: "14dfeb1c-832d-419c-a0ad-9bd775e4064a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.226448 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.226705 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-central-agent" containerID="cri-o://93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e" gracePeriod=30 Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.226889 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="proxy-httpd" containerID="cri-o://1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d" gracePeriod=30 Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.226935 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="sg-core" containerID="cri-o://9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e" gracePeriod=30 Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.227003 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-notification-agent" containerID="cri-o://089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3" gracePeriod=30 Oct 12 20:42:52 crc 
kubenswrapper[4773]: I1012 20:42:52.227699 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14dfeb1c-832d-419c-a0ad-9bd775e4064a" (UID: "14dfeb1c-832d-419c-a0ad-9bd775e4064a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.292228 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293415 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kbfl\" (UniqueName: \"kubernetes.io/projected/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-api-access-9kbfl\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293750 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.293773 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14dfeb1c-832d-419c-a0ad-9bd775e4064a-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.370334 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lln8t\" (UniqueName: \"kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t\") pod \"93fac107-ac20-4b2d-8960-e1f6b742198a\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394526 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data\") pod \"93fac107-ac20-4b2d-8960-e1f6b742198a\" (UID: \"93fac107-ac20-4b2d-8960-e1f6b742198a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394548 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle\") pod \"93fac107-ac20-4b2d-8960-e1f6b742198a\" (UID: 
\"93fac107-ac20-4b2d-8960-e1f6b742198a\") " Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394847 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394936 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kbfl\" (UniqueName: \"kubernetes.io/projected/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-api-access-9kbfl\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.394988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.395034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.398661 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t" (OuterVolumeSpecName: "kube-api-access-lln8t") pod "93fac107-ac20-4b2d-8960-e1f6b742198a" (UID: "93fac107-ac20-4b2d-8960-e1f6b742198a"). 
InnerVolumeSpecName "kube-api-access-lln8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.398752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.410705 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.412405 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c04ed55-a588-4a57-9f14-90fca8e2dab0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.421933 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93fac107-ac20-4b2d-8960-e1f6b742198a" (UID: "93fac107-ac20-4b2d-8960-e1f6b742198a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.422345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kbfl\" (UniqueName: \"kubernetes.io/projected/5c04ed55-a588-4a57-9f14-90fca8e2dab0-kube-api-access-9kbfl\") pod \"kube-state-metrics-0\" (UID: \"5c04ed55-a588-4a57-9f14-90fca8e2dab0\") " pod="openstack/kube-state-metrics-0" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.423097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data" (OuterVolumeSpecName: "config-data") pod "93fac107-ac20-4b2d-8960-e1f6b742198a" (UID: "93fac107-ac20-4b2d-8960-e1f6b742198a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.489253 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9d2321-06c1-42e1-b4a9-d6a759d0eba0" path="/var/lib/kubelet/pods/3d9d2321-06c1-42e1-b4a9-d6a759d0eba0/volumes" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.496317 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lln8t\" (UniqueName: \"kubernetes.io/projected/93fac107-ac20-4b2d-8960-e1f6b742198a-kube-api-access-lln8t\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.496343 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.496352 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fac107-ac20-4b2d-8960-e1f6b742198a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:52 crc kubenswrapper[4773]: I1012 20:42:52.590862 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.024074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14dfeb1c-832d-419c-a0ad-9bd775e4064a","Type":"ContainerDied","Data":"6a37ca843831ace50a5587da5dce79f5cf7319958aa91dc5e3ce94084c879836"} Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.024445 4773 scope.go:117] "RemoveContainer" containerID="323170f182993454fd6e4cbfdf9e43a2604cea7956675e442cdc54b0d8fd526c" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.024088 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.027624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"93fac107-ac20-4b2d-8960-e1f6b742198a","Type":"ContainerDied","Data":"2e96909fc69979c8bff8a6a86070dc361409a2120f5ebc44811b8e7c03fb6b24"} Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.027654 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.045677 4773 generic.go:334] "Generic (PLEG): container finished" podID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerID="1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d" exitCode=0 Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.045703 4773 generic.go:334] "Generic (PLEG): container finished" podID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerID="9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e" exitCode=2 Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.045728 4773 generic.go:334] "Generic (PLEG): container finished" podID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerID="93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e" exitCode=0 Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.045879 4773 scope.go:117] "RemoveContainer" containerID="21723b4d6aece590887274664d79699da7bdde64d71a11fafef0646896b1ba85" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.046003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerDied","Data":"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d"} Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.046039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerDied","Data":"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e"} Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.046053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerDied","Data":"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e"} Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.062109 4773 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: W1012 20:42:53.068350 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c04ed55_a588_4a57_9f14_90fca8e2dab0.slice/crio-9d94ce7ca48ca654b953778eebd102fff618f0198ae2902a9f559fc466b7ca68 WatchSource:0}: Error finding container 9d94ce7ca48ca654b953778eebd102fff618f0198ae2902a9f559fc466b7ca68: Status 404 returned error can't find the container with id 9d94ce7ca48ca654b953778eebd102fff618f0198ae2902a9f559fc466b7ca68 Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.108997 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.135498 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.162502 4773 scope.go:117] "RemoveContainer" containerID="5188d08e9bb1deeaafb846a9ca98244b145fcb355b621b3d91c5112ce43c22a3" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.163895 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.181754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: E1012 20:42:53.182348 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerName="nova-scheduler-scheduler" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.182414 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerName="nova-scheduler-scheduler" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.182639 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" containerName="nova-scheduler-scheduler" Oct 12 
20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.185134 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.188951 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.202114 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.215888 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.217043 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.220386 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.228530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.240035 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315158 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56mv\" (UniqueName: \"kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv\") pod \"nova-scheduler-0\" (UID: 
\"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsf86\" (UniqueName: \"kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315803 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315829 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.315899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 
20:42:53.417734 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417774 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417804 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417848 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417869 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56mv\" (UniqueName: \"kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417910 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsf86\" (UniqueName: 
\"kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.417954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.420097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.424542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.424778 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.425479 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.436837 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.439894 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsf86\" (UniqueName: \"kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86\") pod \"nova-api-0\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.440648 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56mv\" (UniqueName: \"kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv\") pod \"nova-scheduler-0\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") " pod="openstack/nova-scheduler-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.513398 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:42:53 crc kubenswrapper[4773]: I1012 20:42:53.536092 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.002665 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.092038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerStarted","Data":"c6aab2af9f95152cefe7ab6e3142b80997ea327400c54daf13281007d98ad00a"} Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.097678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c04ed55-a588-4a57-9f14-90fca8e2dab0","Type":"ContainerStarted","Data":"d2f2c1ee96eb72d2cab40b221c7f9fcc9e69e376f60173273569b0eb01987736"} Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.097732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c04ed55-a588-4a57-9f14-90fca8e2dab0","Type":"ContainerStarted","Data":"9d94ce7ca48ca654b953778eebd102fff618f0198ae2902a9f559fc466b7ca68"} Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.097923 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.150847 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.767862697 podStartE2EDuration="2.150827425s" podCreationTimestamp="2025-10-12 20:42:52 +0000 UTC" firstStartedPulling="2025-10-12 20:42:53.070652311 +0000 UTC m=+1121.306950861" lastFinishedPulling="2025-10-12 20:42:53.453617029 +0000 UTC m=+1121.689915589" observedRunningTime="2025-10-12 20:42:54.127927679 +0000 UTC m=+1122.364226239" watchObservedRunningTime="2025-10-12 20:42:54.150827425 +0000 UTC m=+1122.387125985" Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.151267 4773 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.499411 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dfeb1c-832d-419c-a0ad-9bd775e4064a" path="/var/lib/kubelet/pods/14dfeb1c-832d-419c-a0ad-9bd775e4064a/volumes" Oct 12 20:42:54 crc kubenswrapper[4773]: I1012 20:42:54.504634 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fac107-ac20-4b2d-8960-e1f6b742198a" path="/var/lib/kubelet/pods/93fac107-ac20-4b2d-8960-e1f6b742198a/volumes" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.105938 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec48c603-e42d-4f2d-b884-48d90cfeb1f6","Type":"ContainerStarted","Data":"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"} Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.105985 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec48c603-e42d-4f2d-b884-48d90cfeb1f6","Type":"ContainerStarted","Data":"8c7748f9ef796efdaef25d11952ad2ac8c63d7391e5cd899277b68e2efceb932"} Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.108596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerStarted","Data":"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f"} Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.108666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerStarted","Data":"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae"} Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.129442 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.129424915 podStartE2EDuration="2.129424915s" 
podCreationTimestamp="2025-10-12 20:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:55.125329431 +0000 UTC m=+1123.361627991" watchObservedRunningTime="2025-10-12 20:42:55.129424915 +0000 UTC m=+1123.365723475" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.142733 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.142705753 podStartE2EDuration="2.142705753s" podCreationTimestamp="2025-10-12 20:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:42:55.141575312 +0000 UTC m=+1123.377873862" watchObservedRunningTime="2025-10-12 20:42:55.142705753 +0000 UTC m=+1123.379004313" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.351980 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.352238 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.678857 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766304 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766425 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55575\" (UniqueName: \"kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766515 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766573 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.766669 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd\") pod \"121ee448-a59b-45ec-9e07-69f8d4e22518\" (UID: \"121ee448-a59b-45ec-9e07-69f8d4e22518\") " Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.767289 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.768056 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.776700 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575" (OuterVolumeSpecName: "kube-api-access-55575") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "kube-api-access-55575". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.782835 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts" (OuterVolumeSpecName: "scripts") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.809198 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.868841 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.868870 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.868882 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55575\" (UniqueName: \"kubernetes.io/projected/121ee448-a59b-45ec-9e07-69f8d4e22518-kube-api-access-55575\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.868891 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc 
kubenswrapper[4773]: I1012 20:42:55.868900 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/121ee448-a59b-45ec-9e07-69f8d4e22518-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.888317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.893407 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data" (OuterVolumeSpecName: "config-data") pod "121ee448-a59b-45ec-9e07-69f8d4e22518" (UID: "121ee448-a59b-45ec-9e07-69f8d4e22518"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.970916 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:55 crc kubenswrapper[4773]: I1012 20:42:55.970946 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121ee448-a59b-45ec-9e07-69f8d4e22518-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.117250 4773 generic.go:334] "Generic (PLEG): container finished" podID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerID="089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3" exitCode=0 Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.117350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerDied","Data":"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3"} Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.117406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"121ee448-a59b-45ec-9e07-69f8d4e22518","Type":"ContainerDied","Data":"14fbf742d25ec4d19f9b56056ddb47a325d22b9a1b0a66a426c9e218ec17302d"} Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.117427 4773 scope.go:117] "RemoveContainer" containerID="1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.118455 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.167698 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.174636 4773 scope.go:117] "RemoveContainer" containerID="9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.184856 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193079 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.193428 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="sg-core" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193446 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="sg-core" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.193462 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-notification-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193470 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-notification-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.193482 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-central-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193489 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-central-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.193510 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="proxy-httpd" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193516 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="proxy-httpd" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193666 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="sg-core" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193681 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="proxy-httpd" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193689 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-central-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.193700 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" containerName="ceilometer-notification-agent" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.195158 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.199128 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.199276 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.199297 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.205601 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.211010 4773 scope.go:117] "RemoveContainer" containerID="089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.265752 4773 scope.go:117] "RemoveContainer" containerID="93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.276665 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.276707 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.276809 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.276847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.276968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.277045 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmdr\" (UniqueName: \"kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.277085 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.277172 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd\") pod \"ceilometer-0\" (UID: 
\"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.289475 4773 scope.go:117] "RemoveContainer" containerID="1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.289867 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d\": container with ID starting with 1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d not found: ID does not exist" containerID="1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.289899 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d"} err="failed to get container status \"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d\": rpc error: code = NotFound desc = could not find container \"1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d\": container with ID starting with 1d097e911a385e19502ac7efeddb917eb6f420b13bbbc44ad9e123cabc44ea9d not found: ID does not exist" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.289924 4773 scope.go:117] "RemoveContainer" containerID="9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.290165 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e\": container with ID starting with 9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e not found: ID does not exist" containerID="9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e" Oct 12 20:42:56 crc kubenswrapper[4773]: 
I1012 20:42:56.290188 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e"} err="failed to get container status \"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e\": rpc error: code = NotFound desc = could not find container \"9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e\": container with ID starting with 9e698a1070ede2430fb59755dda26db53544dfe1b6f3f391d6ee4c280ee1007e not found: ID does not exist" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.290207 4773 scope.go:117] "RemoveContainer" containerID="089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3" Oct 12 20:42:56 crc kubenswrapper[4773]: E1012 20:42:56.290473 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3\": container with ID starting with 089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3 not found: ID does not exist" containerID="089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.290500 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3"} err="failed to get container status \"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3\": rpc error: code = NotFound desc = could not find container \"089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3\": container with ID starting with 089c5e751bbc142a6f1c6f16b7054e63e64df56df3a57263f5cb821f4c1f46d3 not found: ID does not exist" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.290516 4773 scope.go:117] "RemoveContainer" containerID="93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e" Oct 12 20:42:56 crc 
kubenswrapper[4773]: E1012 20:42:56.290904 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e\": container with ID starting with 93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e not found: ID does not exist" containerID="93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.290932 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e"} err="failed to get container status \"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e\": rpc error: code = NotFound desc = could not find container \"93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e\": container with ID starting with 93532a586e61ecfffe1007e778eadd334b2f028da0dbb9301568e54f0d5c873e not found: ID does not exist" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmdr\" (UniqueName: \"kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr\") pod 
\"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378801 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378820 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378837 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.378885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.379279 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.384394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.386458 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.387159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.389130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.392243 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 
20:42:56.405211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmdr\" (UniqueName: \"kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.409170 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " pod="openstack/ceilometer-0" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.501762 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="121ee448-a59b-45ec-9e07-69f8d4e22518" path="/var/lib/kubelet/pods/121ee448-a59b-45ec-9e07-69f8d4e22518/volumes" Oct 12 20:42:56 crc kubenswrapper[4773]: I1012 20:42:56.529376 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:42:57 crc kubenswrapper[4773]: I1012 20:42:57.119417 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:42:57 crc kubenswrapper[4773]: I1012 20:42:57.139598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerStarted","Data":"44ddaabe212e3f3ece617b2507274373d7849264f66414f8660ac277f4636700"} Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.150288 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerStarted","Data":"6f1d0805b7c571cbf8684b8628e825ebc8fd8468cb1b4396fababad15644fe44"} Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.537114 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.669598 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.669896 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.670046 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.670735 4773 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:42:58 crc kubenswrapper[4773]: I1012 20:42:58.670882 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737" gracePeriod=600 Oct 12 20:42:59 crc kubenswrapper[4773]: I1012 20:42:59.161547 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerStarted","Data":"135c39d48b9e074b2e5dbec49375117e64b77b183eef733dc32c37a7605160ba"} Oct 12 20:42:59 crc kubenswrapper[4773]: I1012 20:42:59.164778 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737" exitCode=0 Oct 12 20:42:59 crc kubenswrapper[4773]: I1012 20:42:59.164831 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737"} Oct 12 20:42:59 crc kubenswrapper[4773]: I1012 20:42:59.164864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171"} Oct 12 
20:42:59 crc kubenswrapper[4773]: I1012 20:42:59.164884 4773 scope.go:117] "RemoveContainer" containerID="eac722170f5344e043159ef0831f8b64693997069824d20f87b36a000f16f635" Oct 12 20:43:00 crc kubenswrapper[4773]: I1012 20:43:00.173378 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerStarted","Data":"f994f7c96be68c69ab5956bd6b716149591e35888e1c5fe16c40a96ed5318154"} Oct 12 20:43:00 crc kubenswrapper[4773]: I1012 20:43:00.352224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 20:43:00 crc kubenswrapper[4773]: I1012 20:43:00.352293 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 20:43:01 crc kubenswrapper[4773]: I1012 20:43:01.184302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerStarted","Data":"c7949a83de9b6a1cd8b8c2e651c5578f48aad7023b086b6164e3640b6092e2c0"} Oct 12 20:43:01 crc kubenswrapper[4773]: I1012 20:43:01.185786 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 20:43:01 crc kubenswrapper[4773]: I1012 20:43:01.207973 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.913374711 podStartE2EDuration="5.207955947s" podCreationTimestamp="2025-10-12 20:42:56 +0000 UTC" firstStartedPulling="2025-10-12 20:42:57.131968384 +0000 UTC m=+1125.368266944" lastFinishedPulling="2025-10-12 20:43:00.42654962 +0000 UTC m=+1128.662848180" observedRunningTime="2025-10-12 20:43:01.203680598 +0000 UTC m=+1129.439979158" watchObservedRunningTime="2025-10-12 20:43:01.207955947 +0000 UTC m=+1129.444254497" Oct 12 20:43:01 crc kubenswrapper[4773]: I1012 20:43:01.361882 4773 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:01 crc kubenswrapper[4773]: I1012 20:43:01.361882 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:02 crc kubenswrapper[4773]: I1012 20:43:02.604565 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 20:43:03 crc kubenswrapper[4773]: I1012 20:43:03.514306 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:43:03 crc kubenswrapper[4773]: I1012 20:43:03.514365 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:43:03 crc kubenswrapper[4773]: I1012 20:43:03.537552 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 20:43:03 crc kubenswrapper[4773]: I1012 20:43:03.567177 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 20:43:04 crc kubenswrapper[4773]: I1012 20:43:04.232285 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 20:43:04 crc kubenswrapper[4773]: I1012 20:43:04.597918 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Oct 12 20:43:04 crc kubenswrapper[4773]: I1012 20:43:04.597940 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:10 crc kubenswrapper[4773]: I1012 20:43:10.385664 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 20:43:10 crc kubenswrapper[4773]: I1012 20:43:10.386479 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 20:43:10 crc kubenswrapper[4773]: I1012 20:43:10.395085 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 20:43:10 crc kubenswrapper[4773]: I1012 20:43:10.405417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.094708 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.149003 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data\") pod \"49792abe-3a04-4816-83ae-cdc737f4527d\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.149107 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhf5q\" (UniqueName: \"kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q\") pod \"49792abe-3a04-4816-83ae-cdc737f4527d\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.149137 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle\") pod \"49792abe-3a04-4816-83ae-cdc737f4527d\" (UID: \"49792abe-3a04-4816-83ae-cdc737f4527d\") " Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.157463 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q" (OuterVolumeSpecName: "kube-api-access-vhf5q") pod "49792abe-3a04-4816-83ae-cdc737f4527d" (UID: "49792abe-3a04-4816-83ae-cdc737f4527d"). InnerVolumeSpecName "kube-api-access-vhf5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.175389 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data" (OuterVolumeSpecName: "config-data") pod "49792abe-3a04-4816-83ae-cdc737f4527d" (UID: "49792abe-3a04-4816-83ae-cdc737f4527d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.181846 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49792abe-3a04-4816-83ae-cdc737f4527d" (UID: "49792abe-3a04-4816-83ae-cdc737f4527d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.251620 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.251902 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhf5q\" (UniqueName: \"kubernetes.io/projected/49792abe-3a04-4816-83ae-cdc737f4527d-kube-api-access-vhf5q\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.251980 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49792abe-3a04-4816-83ae-cdc737f4527d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.280698 4773 generic.go:334] "Generic (PLEG): container finished" podID="49792abe-3a04-4816-83ae-cdc737f4527d" containerID="331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d" exitCode=137 Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.280771 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.280762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"49792abe-3a04-4816-83ae-cdc737f4527d","Type":"ContainerDied","Data":"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d"} Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.280937 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"49792abe-3a04-4816-83ae-cdc737f4527d","Type":"ContainerDied","Data":"374b6e3fc9e27dab47d2a336692a5efbc373deced01443d6e173fd9ba9d0c980"} Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.280975 4773 scope.go:117] "RemoveContainer" containerID="331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.306803 4773 scope.go:117] "RemoveContainer" containerID="331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d" Oct 12 20:43:11 crc kubenswrapper[4773]: E1012 20:43:11.307389 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d\": container with ID starting with 331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d not found: ID does not exist" containerID="331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.307472 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d"} err="failed to get container status \"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d\": rpc error: code = NotFound desc = could not find container \"331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d\": container with ID starting with 
331ecffcf442a7cb21640284c38caec7c7016283ea3d6e2310a46e8d4507955d not found: ID does not exist" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.310415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.319067 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.332014 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:43:11 crc kubenswrapper[4773]: E1012 20:43:11.332552 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49792abe-3a04-4816-83ae-cdc737f4527d" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.332623 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49792abe-3a04-4816-83ae-cdc737f4527d" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.332843 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49792abe-3a04-4816-83ae-cdc737f4527d" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.333446 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.336787 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.337327 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.337508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.362097 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.455421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk2zz\" (UniqueName: \"kubernetes.io/projected/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-kube-api-access-mk2zz\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.455600 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.455802 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.456078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.456247 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.558357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk2zz\" (UniqueName: \"kubernetes.io/projected/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-kube-api-access-mk2zz\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.558432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.558466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc 
kubenswrapper[4773]: I1012 20:43:11.558509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.558558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.562808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.563330 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.564045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.569308 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.581581 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk2zz\" (UniqueName: \"kubernetes.io/projected/4f9f8baf-3f94-4e6c-b5ec-f9763330a042-kube-api-access-mk2zz\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f9f8baf-3f94-4e6c-b5ec-f9763330a042\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:11 crc kubenswrapper[4773]: I1012 20:43:11.649268 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:12 crc kubenswrapper[4773]: I1012 20:43:12.208137 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 20:43:12 crc kubenswrapper[4773]: I1012 20:43:12.291664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f9f8baf-3f94-4e6c-b5ec-f9763330a042","Type":"ContainerStarted","Data":"88dee4ccfd6f0e577e3f592ff234c1d29df0965bf682b128c195b60228660b26"} Oct 12 20:43:12 crc kubenswrapper[4773]: I1012 20:43:12.492169 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49792abe-3a04-4816-83ae-cdc737f4527d" path="/var/lib/kubelet/pods/49792abe-3a04-4816-83ae-cdc737f4527d/volumes" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.304346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f9f8baf-3f94-4e6c-b5ec-f9763330a042","Type":"ContainerStarted","Data":"9a1047707af114c7ccd03a0827b45f40cdaea2ea9e97a74a25966701fdff0743"} Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.373626 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.373606546 podStartE2EDuration="2.373606546s" podCreationTimestamp="2025-10-12 20:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:13.361961032 +0000 UTC m=+1141.598259602" watchObservedRunningTime="2025-10-12 20:43:13.373606546 +0000 UTC m=+1141.609905116" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.517581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.518038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.518334 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.518369 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.521532 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.521870 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.766656 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.768470 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.799103 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.807821 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.807900 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.807935 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.807962 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.807985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mkjd4\" (UniqueName: \"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.909186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.909266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.909298 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.909325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.909342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkjd4\" (UniqueName: 
\"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.910330 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.910382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.910456 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.910925 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb\") pod \"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:13 crc kubenswrapper[4773]: I1012 20:43:13.960005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkjd4\" (UniqueName: \"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4\") pod 
\"dnsmasq-dns-665946c669-gtbdj\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:14 crc kubenswrapper[4773]: I1012 20:43:14.095836 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:14 crc kubenswrapper[4773]: I1012 20:43:14.619636 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.358527 4773 generic.go:334] "Generic (PLEG): container finished" podID="526df519-a931-4d53-b729-3256ced8c813" containerID="ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c" exitCode=0 Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.361310 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665946c669-gtbdj" event={"ID":"526df519-a931-4d53-b729-3256ced8c813","Type":"ContainerDied","Data":"ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c"} Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.361402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665946c669-gtbdj" event={"ID":"526df519-a931-4d53-b729-3256ced8c813","Type":"ContainerStarted","Data":"2ff8ea2c34f8916e884848f261b439dad0a5f2e1e475965ef1cd5e464d0578a2"} Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.477944 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.478485 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-central-agent" containerID="cri-o://6f1d0805b7c571cbf8684b8628e825ebc8fd8468cb1b4396fababad15644fe44" gracePeriod=30 Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.479449 4773 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="proxy-httpd" containerID="cri-o://c7949a83de9b6a1cd8b8c2e651c5578f48aad7023b086b6164e3640b6092e2c0" gracePeriod=30 Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.479583 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="sg-core" containerID="cri-o://f994f7c96be68c69ab5956bd6b716149591e35888e1c5fe16c40a96ed5318154" gracePeriod=30 Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.479834 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-notification-agent" containerID="cri-o://135c39d48b9e074b2e5dbec49375117e64b77b183eef733dc32c37a7605160ba" gracePeriod=30 Oct 12 20:43:15 crc kubenswrapper[4773]: I1012 20:43:15.512937 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.180:3000/\": read tcp 10.217.0.2:48554->10.217.0.180:3000: read: connection reset by peer" Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.370466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665946c669-gtbdj" event={"ID":"526df519-a931-4d53-b729-3256ced8c813","Type":"ContainerStarted","Data":"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32"} Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.370981 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373123 4773 generic.go:334] "Generic (PLEG): container finished" podID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" 
containerID="c7949a83de9b6a1cd8b8c2e651c5578f48aad7023b086b6164e3640b6092e2c0" exitCode=0 Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373150 4773 generic.go:334] "Generic (PLEG): container finished" podID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerID="f994f7c96be68c69ab5956bd6b716149591e35888e1c5fe16c40a96ed5318154" exitCode=2 Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373159 4773 generic.go:334] "Generic (PLEG): container finished" podID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerID="6f1d0805b7c571cbf8684b8628e825ebc8fd8468cb1b4396fababad15644fe44" exitCode=0 Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerDied","Data":"c7949a83de9b6a1cd8b8c2e651c5578f48aad7023b086b6164e3640b6092e2c0"} Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerDied","Data":"f994f7c96be68c69ab5956bd6b716149591e35888e1c5fe16c40a96ed5318154"} Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.373211 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerDied","Data":"6f1d0805b7c571cbf8684b8628e825ebc8fd8468cb1b4396fababad15644fe44"} Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.399412 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-665946c669-gtbdj" podStartSLOduration=3.399383157 podStartE2EDuration="3.399383157s" podCreationTimestamp="2025-10-12 20:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:16.390422958 +0000 UTC m=+1144.626721518" 
watchObservedRunningTime="2025-10-12 20:43:16.399383157 +0000 UTC m=+1144.635681717" Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.650066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.684098 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.684331 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-log" containerID="cri-o://c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae" gracePeriod=30 Oct 12 20:43:16 crc kubenswrapper[4773]: I1012 20:43:16.684453 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-api" containerID="cri-o://8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f" gracePeriod=30 Oct 12 20:43:17 crc kubenswrapper[4773]: I1012 20:43:17.409875 4773 generic.go:334] "Generic (PLEG): container finished" podID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerID="c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae" exitCode=143 Oct 12 20:43:17 crc kubenswrapper[4773]: I1012 20:43:17.409955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerDied","Data":"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae"} Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.250145 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.366147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsf86\" (UniqueName: \"kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86\") pod \"40b84df2-e013-45e9-8487-00e3147f0a2b\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.366322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data\") pod \"40b84df2-e013-45e9-8487-00e3147f0a2b\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.366915 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle\") pod \"40b84df2-e013-45e9-8487-00e3147f0a2b\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.367233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs\") pod \"40b84df2-e013-45e9-8487-00e3147f0a2b\" (UID: \"40b84df2-e013-45e9-8487-00e3147f0a2b\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.371210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs" (OuterVolumeSpecName: "logs") pod "40b84df2-e013-45e9-8487-00e3147f0a2b" (UID: "40b84df2-e013-45e9-8487-00e3147f0a2b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.379441 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86" (OuterVolumeSpecName: "kube-api-access-lsf86") pod "40b84df2-e013-45e9-8487-00e3147f0a2b" (UID: "40b84df2-e013-45e9-8487-00e3147f0a2b"). InnerVolumeSpecName "kube-api-access-lsf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.414479 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data" (OuterVolumeSpecName: "config-data") pod "40b84df2-e013-45e9-8487-00e3147f0a2b" (UID: "40b84df2-e013-45e9-8487-00e3147f0a2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.444818 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b84df2-e013-45e9-8487-00e3147f0a2b" (UID: "40b84df2-e013-45e9-8487-00e3147f0a2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.461169 4773 generic.go:334] "Generic (PLEG): container finished" podID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerID="8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f" exitCode=0 Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.461244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerDied","Data":"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f"} Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.461274 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b84df2-e013-45e9-8487-00e3147f0a2b","Type":"ContainerDied","Data":"c6aab2af9f95152cefe7ab6e3142b80997ea327400c54daf13281007d98ad00a"} Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.461293 4773 scope.go:117] "RemoveContainer" containerID="8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.461436 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.482389 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.483187 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b84df2-e013-45e9-8487-00e3147f0a2b-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.483355 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsf86\" (UniqueName: \"kubernetes.io/projected/40b84df2-e013-45e9-8487-00e3147f0a2b-kube-api-access-lsf86\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.483426 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b84df2-e013-45e9-8487-00e3147f0a2b-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.490210 4773 generic.go:334] "Generic (PLEG): container finished" podID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerID="135c39d48b9e074b2e5dbec49375117e64b77b183eef733dc32c37a7605160ba" exitCode=0 Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.515210 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerDied","Data":"135c39d48b9e074b2e5dbec49375117e64b77b183eef733dc32c37a7605160ba"} Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.539273 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.549417 4773 scope.go:117] "RemoveContainer" containerID="c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae" Oct 12 
20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.571855 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.584007 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:20 crc kubenswrapper[4773]: E1012 20:43:20.586545 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-log" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.586569 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-log" Oct 12 20:43:20 crc kubenswrapper[4773]: E1012 20:43:20.586588 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-api" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.586596 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-api" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.586811 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-log" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.586834 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" containerName="nova-api-api" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.588619 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.594616 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.594638 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.594923 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.595242 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.602927 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.608991 4773 scope.go:117] "RemoveContainer" containerID="8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f" Oct 12 20:43:20 crc kubenswrapper[4773]: E1012 20:43:20.611250 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f\": container with ID starting with 8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f not found: ID does not exist" containerID="8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.611301 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f"} err="failed to get container status \"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f\": rpc error: code = NotFound desc = could not find container \"8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f\": container with ID 
starting with 8b422b0459cef7483d70bf0cfb4443599ea2f4ef30225b14095b32bedc69f53f not found: ID does not exist" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.611336 4773 scope.go:117] "RemoveContainer" containerID="c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae" Oct 12 20:43:20 crc kubenswrapper[4773]: E1012 20:43:20.612296 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae\": container with ID starting with c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae not found: ID does not exist" containerID="c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.612323 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae"} err="failed to get container status \"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae\": rpc error: code = NotFound desc = could not find container \"c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae\": container with ID starting with c80df8dbc709de83b8070c76ad3a2fdeb8957ca627d5646ef2c4545c077dfdae not found: ID does not exist" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.686604 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.686679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs\") pod 
\"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.686774 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.686837 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.686952 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687000 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687082 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjmdr\" (UniqueName: \"kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687099 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data\") pod \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\" (UID: \"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0\") " Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687579 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpzm\" (UniqueName: \"kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs\") pod \"nova-api-0\" (UID: 
\"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.687679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.688669 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.689063 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.690478 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts" (OuterVolumeSpecName: "scripts") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.692038 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr" (OuterVolumeSpecName: "kube-api-access-bjmdr") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "kube-api-access-bjmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.737940 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.743392 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.788931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.788984 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.789046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.789106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.789141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.789184 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpzm\" (UniqueName: 
\"kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.789421 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791161 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjmdr\" (UniqueName: \"kubernetes.io/projected/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-kube-api-access-bjmdr\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791237 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791252 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791265 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791289 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791303 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.791348 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.794690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.796074 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.796539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.798774 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 
20:43:20.814786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpzm\" (UniqueName: \"kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm\") pod \"nova-api-0\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " pod="openstack/nova-api-0" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.817995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data" (OuterVolumeSpecName: "config-data") pod "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" (UID: "f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.893115 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.893385 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:20 crc kubenswrapper[4773]: I1012 20:43:20.919139 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.396887 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.502327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerStarted","Data":"e889a542126e0fdfbf5596ca90bad77d7f7391b2db555fbd64e0d8d61eb51609"} Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.508051 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0","Type":"ContainerDied","Data":"44ddaabe212e3f3ece617b2507274373d7849264f66414f8660ac277f4636700"} Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.508303 4773 scope.go:117] "RemoveContainer" containerID="c7949a83de9b6a1cd8b8c2e651c5578f48aad7023b086b6164e3640b6092e2c0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.508130 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.541708 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.549053 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.561101 4773 scope.go:117] "RemoveContainer" containerID="f994f7c96be68c69ab5956bd6b716149591e35888e1c5fe16c40a96ed5318154" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564030 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:43:21 crc kubenswrapper[4773]: E1012 20:43:21.564379 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-central-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564393 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-central-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: E1012 20:43:21.564424 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="proxy-httpd" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564431 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="proxy-httpd" Oct 12 20:43:21 crc kubenswrapper[4773]: E1012 20:43:21.564444 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="sg-core" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564450 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="sg-core" Oct 12 20:43:21 crc kubenswrapper[4773]: E1012 20:43:21.564459 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-notification-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564465 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-notification-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564610 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="sg-core" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564636 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="proxy-httpd" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564649 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-notification-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.564661 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" containerName="ceilometer-central-agent" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.567164 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.570445 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.570625 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.570761 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.591855 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.610388 4773 scope.go:117] "RemoveContainer" containerID="135c39d48b9e074b2e5dbec49375117e64b77b183eef733dc32c37a7605160ba" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611466 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " 
pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611737 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh87q\" (UniqueName: \"kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.611844 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.639121 4773 scope.go:117] "RemoveContainer" 
containerID="6f1d0805b7c571cbf8684b8628e825ebc8fd8468cb1b4396fababad15644fe44" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.650169 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.677153 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh87q\" (UniqueName: \"kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717671 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717797 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.717824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.719783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.720100 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0" Oct 12 20:43:21 crc kubenswrapper[4773]: 
I1012 20:43:21.728240 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.729059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.732054 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.734678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.735778 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh87q\" (UniqueName: \"kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.739142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " pod="openstack/ceilometer-0"
Oct 12 20:43:21 crc kubenswrapper[4773]: I1012 20:43:21.888317 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.387867 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.492380 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b84df2-e013-45e9-8487-00e3147f0a2b" path="/var/lib/kubelet/pods/40b84df2-e013-45e9-8487-00e3147f0a2b/volumes"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.493135 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0" path="/var/lib/kubelet/pods/f37b4d26-fee3-4ac3-b3df-5b3ddda59ab0/volumes"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.535394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerStarted","Data":"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180"}
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.535440 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerStarted","Data":"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8"}
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.542971 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerStarted","Data":"3e8b89f7e8ac34ff048eea1378a5c455138047a0c0b0a1a97b3cc66d98ba1017"}
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.559411 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.559392502 podStartE2EDuration="2.559392502s" podCreationTimestamp="2025-10-12 20:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:22.549895968 +0000 UTC m=+1150.786194538" watchObservedRunningTime="2025-10-12 20:43:22.559392502 +0000 UTC m=+1150.795691062"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.570129 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.860205 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vhgr7"]
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.861287 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.867517 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.867672 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.892991 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vhgr7"]
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.956735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.957095 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.957149 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:22 crc kubenswrapper[4773]: I1012 20:43:22.957170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tnt\" (UniqueName: \"kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.058199 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.058240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tnt\" (UniqueName: \"kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.058325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.058364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.064641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.066441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.077214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tnt\" (UniqueName: \"kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.078279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vhgr7\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") " pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.274051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.552186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerStarted","Data":"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7"}
Oct 12 20:43:23 crc kubenswrapper[4773]: I1012 20:43:23.757235 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vhgr7"]
Oct 12 20:43:23 crc kubenswrapper[4773]: W1012 20:43:23.760321 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod439aaeb2_8cab_4025_a89b_33fda13b4c5d.slice/crio-4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2 WatchSource:0}: Error finding container 4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2: Status 404 returned error can't find the container with id 4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.097848 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-665946c669-gtbdj"
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.159818 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"]
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.166082 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="dnsmasq-dns" containerID="cri-o://ed73eaf36fbee7b2df707c4dbee48e0f25dcd2138030a9fb86d027b6005457d4" gracePeriod=10
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.565110 4773 generic.go:334] "Generic (PLEG): container finished" podID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerID="ed73eaf36fbee7b2df707c4dbee48e0f25dcd2138030a9fb86d027b6005457d4" exitCode=0
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.565397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" event={"ID":"f9e044cb-617e-4470-bdd0-d7a28f2d2a63","Type":"ContainerDied","Data":"ed73eaf36fbee7b2df707c4dbee48e0f25dcd2138030a9fb86d027b6005457d4"}
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.567704 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vhgr7" event={"ID":"439aaeb2-8cab-4025-a89b-33fda13b4c5d","Type":"ContainerStarted","Data":"63728d2f1ddff77313bc16ed7b29c60dd1cf7a896d5e0ae60b66500db4ef5f25"}
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.568418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vhgr7" event={"ID":"439aaeb2-8cab-4025-a89b-33fda13b4c5d","Type":"ContainerStarted","Data":"4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2"}
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.577513 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerStarted","Data":"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f"}
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.577784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerStarted","Data":"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a"}
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.587882 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vhgr7" podStartSLOduration=2.587826562 podStartE2EDuration="2.587826562s" podCreationTimestamp="2025-10-12 20:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:24.582330199 +0000 UTC m=+1152.818628759" watchObservedRunningTime="2025-10-12 20:43:24.587826562 +0000 UTC m=+1152.824125122"
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.647950 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fb48c489-gzvml"
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.687372 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb\") pod \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") "
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.687462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bc4g\" (UniqueName: \"kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g\") pod \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") "
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.687517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb\") pod \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") "
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.687556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc\") pod \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") "
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.687651 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config\") pod \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\" (UID: \"f9e044cb-617e-4470-bdd0-d7a28f2d2a63\") "
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.697014 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g" (OuterVolumeSpecName: "kube-api-access-5bc4g") pod "f9e044cb-617e-4470-bdd0-d7a28f2d2a63" (UID: "f9e044cb-617e-4470-bdd0-d7a28f2d2a63"). InnerVolumeSpecName "kube-api-access-5bc4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.775214 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config" (OuterVolumeSpecName: "config") pod "f9e044cb-617e-4470-bdd0-d7a28f2d2a63" (UID: "f9e044cb-617e-4470-bdd0-d7a28f2d2a63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.775257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9e044cb-617e-4470-bdd0-d7a28f2d2a63" (UID: "f9e044cb-617e-4470-bdd0-d7a28f2d2a63"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.775383 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9e044cb-617e-4470-bdd0-d7a28f2d2a63" (UID: "f9e044cb-617e-4470-bdd0-d7a28f2d2a63"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.782398 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9e044cb-617e-4470-bdd0-d7a28f2d2a63" (UID: "f9e044cb-617e-4470-bdd0-d7a28f2d2a63"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.790168 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.790205 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bc4g\" (UniqueName: \"kubernetes.io/projected/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-kube-api-access-5bc4g\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.790218 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.790228 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:24 crc kubenswrapper[4773]: I1012 20:43:24.790240 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e044cb-617e-4470-bdd0-d7a28f2d2a63-config\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.594380 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fb48c489-gzvml"
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.596049 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fb48c489-gzvml" event={"ID":"f9e044cb-617e-4470-bdd0-d7a28f2d2a63","Type":"ContainerDied","Data":"e6da95be093cfec1ebb2893ebffbd39eb9379ed806832dd73089216a2964bb13"}
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.596183 4773 scope.go:117] "RemoveContainer" containerID="ed73eaf36fbee7b2df707c4dbee48e0f25dcd2138030a9fb86d027b6005457d4"
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.691101 4773 scope.go:117] "RemoveContainer" containerID="122ee3050c4c2ac28f95a1769be0784635b524fb27bb3542123edb6b67ec2757"
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.723470 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"]
Oct 12 20:43:25 crc kubenswrapper[4773]: I1012 20:43:25.733041 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75fb48c489-gzvml"]
Oct 12 20:43:26 crc kubenswrapper[4773]: I1012 20:43:26.489783 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" path="/var/lib/kubelet/pods/f9e044cb-617e-4470-bdd0-d7a28f2d2a63/volumes"
Oct 12 20:43:26 crc kubenswrapper[4773]: I1012 20:43:26.603450 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerStarted","Data":"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2"}
Oct 12 20:43:26 crc kubenswrapper[4773]: I1012 20:43:26.603596 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 12 20:43:29 crc kubenswrapper[4773]: I1012 20:43:29.635684 4773 generic.go:334] "Generic (PLEG): container finished" podID="439aaeb2-8cab-4025-a89b-33fda13b4c5d" containerID="63728d2f1ddff77313bc16ed7b29c60dd1cf7a896d5e0ae60b66500db4ef5f25" exitCode=0
Oct 12 20:43:29 crc kubenswrapper[4773]: I1012 20:43:29.635798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vhgr7" event={"ID":"439aaeb2-8cab-4025-a89b-33fda13b4c5d","Type":"ContainerDied","Data":"63728d2f1ddff77313bc16ed7b29c60dd1cf7a896d5e0ae60b66500db4ef5f25"}
Oct 12 20:43:29 crc kubenswrapper[4773]: I1012 20:43:29.654037 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.503831995 podStartE2EDuration="8.654022175s" podCreationTimestamp="2025-10-12 20:43:21 +0000 UTC" firstStartedPulling="2025-10-12 20:43:22.397046248 +0000 UTC m=+1150.633344818" lastFinishedPulling="2025-10-12 20:43:25.547236438 +0000 UTC m=+1153.783534998" observedRunningTime="2025-10-12 20:43:26.631185847 +0000 UTC m=+1154.867484407" watchObservedRunningTime="2025-10-12 20:43:29.654022175 +0000 UTC m=+1157.890320735"
Oct 12 20:43:30 crc kubenswrapper[4773]: I1012 20:43:30.919939 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 12 20:43:30 crc kubenswrapper[4773]: I1012 20:43:30.920625 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.089662 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.096390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle\") pod \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") "
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.096613 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7tnt\" (UniqueName: \"kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt\") pod \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") "
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.096741 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts\") pod \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") "
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.096893 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data\") pod \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\" (UID: \"439aaeb2-8cab-4025-a89b-33fda13b4c5d\") "
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.116634 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts" (OuterVolumeSpecName: "scripts") pod "439aaeb2-8cab-4025-a89b-33fda13b4c5d" (UID: "439aaeb2-8cab-4025-a89b-33fda13b4c5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.116821 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt" (OuterVolumeSpecName: "kube-api-access-b7tnt") pod "439aaeb2-8cab-4025-a89b-33fda13b4c5d" (UID: "439aaeb2-8cab-4025-a89b-33fda13b4c5d"). InnerVolumeSpecName "kube-api-access-b7tnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.155655 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data" (OuterVolumeSpecName: "config-data") pod "439aaeb2-8cab-4025-a89b-33fda13b4c5d" (UID: "439aaeb2-8cab-4025-a89b-33fda13b4c5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.167864 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "439aaeb2-8cab-4025-a89b-33fda13b4c5d" (UID: "439aaeb2-8cab-4025-a89b-33fda13b4c5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.199641 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.199767 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7tnt\" (UniqueName: \"kubernetes.io/projected/439aaeb2-8cab-4025-a89b-33fda13b4c5d-kube-api-access-b7tnt\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.199926 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.199999 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/439aaeb2-8cab-4025-a89b-33fda13b4c5d-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.653525 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vhgr7" event={"ID":"439aaeb2-8cab-4025-a89b-33fda13b4c5d","Type":"ContainerDied","Data":"4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2"}
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.653922 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c5305bb513236dbc1d173cf9ea2745cc5a92237b25d8ad03b5ea7917b4f09e2"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.653542 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vhgr7"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.797878 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.798175 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" containerName="nova-scheduler-scheduler" containerID="cri-o://6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0" gracePeriod=30
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.816201 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.816455 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-log" containerID="cri-o://d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8" gracePeriod=30
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.816600 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-api" containerID="cri-o://54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180" gracePeriod=30
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.821000 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": EOF"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.821051 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": EOF"
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.878147 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.878683 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" containerID="cri-o://a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060" gracePeriod=30
Oct 12 20:43:31 crc kubenswrapper[4773]: I1012 20:43:31.879050 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" containerID="cri-o://33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca" gracePeriod=30
Oct 12 20:43:32 crc kubenswrapper[4773]: I1012 20:43:32.669874 4773 generic.go:334] "Generic (PLEG): container finished" podID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerID="a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060" exitCode=143
Oct 12 20:43:32 crc kubenswrapper[4773]: I1012 20:43:32.669932 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerDied","Data":"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060"}
Oct 12 20:43:32 crc kubenswrapper[4773]: I1012 20:43:32.672541 4773 generic.go:334] "Generic (PLEG): container finished" podID="e2f29efb-8a93-4806-a572-03ec784612a9" containerID="d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8" exitCode=143
Oct 12 20:43:32 crc kubenswrapper[4773]: I1012 20:43:32.672565 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerDied","Data":"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8"}
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.345474 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.447021 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56mv\" (UniqueName: \"kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv\") pod \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") "
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.447192 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data\") pod \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") "
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.447321 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle\") pod \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\" (UID: \"ec48c603-e42d-4f2d-b884-48d90cfeb1f6\") "
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.465486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv" (OuterVolumeSpecName: "kube-api-access-p56mv") pod "ec48c603-e42d-4f2d-b884-48d90cfeb1f6" (UID: "ec48c603-e42d-4f2d-b884-48d90cfeb1f6"). InnerVolumeSpecName "kube-api-access-p56mv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.482662 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec48c603-e42d-4f2d-b884-48d90cfeb1f6" (UID: "ec48c603-e42d-4f2d-b884-48d90cfeb1f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.490626 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data" (OuterVolumeSpecName: "config-data") pod "ec48c603-e42d-4f2d-b884-48d90cfeb1f6" (UID: "ec48c603-e42d-4f2d-b884-48d90cfeb1f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.549684 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.549732 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p56mv\" (UniqueName: \"kubernetes.io/projected/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-kube-api-access-p56mv\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.549743 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec48c603-e42d-4f2d-b884-48d90cfeb1f6-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.687313 4773 generic.go:334] "Generic (PLEG): container finished" podID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" containerID="6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0" exitCode=0
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.687517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec48c603-e42d-4f2d-b884-48d90cfeb1f6","Type":"ContainerDied","Data":"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"}
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.688638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec48c603-e42d-4f2d-b884-48d90cfeb1f6","Type":"ContainerDied","Data":"8c7748f9ef796efdaef25d11952ad2ac8c63d7391e5cd899277b68e2efceb932"}
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.688797 4773 scope.go:117] "RemoveContainer" containerID="6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.687572 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.718148 4773 scope.go:117] "RemoveContainer" containerID="6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"
Oct 12 20:43:33 crc kubenswrapper[4773]: E1012 20:43:33.719291 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0\": container with ID starting with 6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0 not found: ID does not exist" containerID="6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.719320 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0"} err="failed to get container status \"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0\": rpc error: code = NotFound desc = could not find container \"6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0\": container with ID starting with 6cd01ce5abd083088d7d90d15db8c872654f30d218f99b10bec6a24bea4069c0 not found: ID does not exist"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.724746 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.733438 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762281 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 20:43:33 crc kubenswrapper[4773]: E1012 20:43:33.762658 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439aaeb2-8cab-4025-a89b-33fda13b4c5d" containerName="nova-manage"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762674 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="439aaeb2-8cab-4025-a89b-33fda13b4c5d" containerName="nova-manage"
Oct 12 20:43:33 crc kubenswrapper[4773]: E1012 20:43:33.762691 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="dnsmasq-dns"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762698 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="dnsmasq-dns"
Oct 12 20:43:33 crc kubenswrapper[4773]: E1012 20:43:33.762708 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="init"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762725 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="init"
Oct 12 20:43:33 crc kubenswrapper[4773]: E1012 20:43:33.762735 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" containerName="nova-scheduler-scheduler"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762741 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" containerName="nova-scheduler-scheduler"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762898 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" containerName="nova-scheduler-scheduler"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762912 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e044cb-617e-4470-bdd0-d7a28f2d2a63" containerName="dnsmasq-dns"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.762929 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="439aaeb2-8cab-4025-a89b-33fda13b4c5d" containerName="nova-manage"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.763475 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.768165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.783788 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.956866 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0"
Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.956908 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-config-data\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:33 crc kubenswrapper[4773]: I1012 20:43:33.956962 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scm68\" (UniqueName: \"kubernetes.io/projected/bd4ee762-fd56-496f-860b-89201215948c-kube-api-access-scm68\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.058411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.058454 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-config-data\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.058504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scm68\" (UniqueName: \"kubernetes.io/projected/bd4ee762-fd56-496f-860b-89201215948c-kube-api-access-scm68\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.070846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-config-data\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") 
" pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.070992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ee762-fd56-496f-860b-89201215948c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.074539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scm68\" (UniqueName: \"kubernetes.io/projected/bd4ee762-fd56-496f-860b-89201215948c-kube-api-access-scm68\") pod \"nova-scheduler-0\" (UID: \"bd4ee762-fd56-496f-860b-89201215948c\") " pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.100584 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.495541 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec48c603-e42d-4f2d-b884-48d90cfeb1f6" path="/var/lib/kubelet/pods/ec48c603-e42d-4f2d-b884-48d90cfeb1f6/volumes" Oct 12 20:43:34 crc kubenswrapper[4773]: W1012 20:43:34.641514 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4ee762_fd56_496f_860b_89201215948c.slice/crio-93442bf0f378f8a759286f2a92266ad67ce6b80860e9a814d04b16ec053aaad5 WatchSource:0}: Error finding container 93442bf0f378f8a759286f2a92266ad67ce6b80860e9a814d04b16ec053aaad5: Status 404 returned error can't find the container with id 93442bf0f378f8a759286f2a92266ad67ce6b80860e9a814d04b16ec053aaad5 Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.649382 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 20:43:34 crc kubenswrapper[4773]: I1012 20:43:34.699340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"bd4ee762-fd56-496f-860b-89201215948c","Type":"ContainerStarted","Data":"93442bf0f378f8a759286f2a92266ad67ce6b80860e9a814d04b16ec053aaad5"} Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.352670 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": dial tcp 10.217.0.176:8775: connect: connection refused" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.352705 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": dial tcp 10.217.0.176:8775: connect: connection refused" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.713093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd4ee762-fd56-496f-860b-89201215948c","Type":"ContainerStarted","Data":"656483b4b456b2615ac2d3c21b6388d1d8b1712e2216529a88c6a5e70d51cbe3"} Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.713486 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.715262 4773 generic.go:334] "Generic (PLEG): container finished" podID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerID="33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca" exitCode=0 Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.715290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerDied","Data":"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca"} Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.715307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0","Type":"ContainerDied","Data":"54d5c46617d43798492c30bc9fc62a258eee5533e4781d5bfe52a0e8b2bb9a1f"} Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.715332 4773 scope.go:117] "RemoveContainer" containerID="33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.753888 4773 scope.go:117] "RemoveContainer" containerID="a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.772284 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.772269051 podStartE2EDuration="2.772269051s" podCreationTimestamp="2025-10-12 20:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:35.765139902 +0000 UTC m=+1164.001438462" watchObservedRunningTime="2025-10-12 20:43:35.772269051 +0000 UTC m=+1164.008567601" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.788661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j884p\" 
(UniqueName: \"kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p\") pod \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.788844 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs\") pod \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.788909 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data\") pod \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.788979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs\") pod \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.790910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs" (OuterVolumeSpecName: "logs") pod "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" (UID: "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.800164 4773 scope.go:117] "RemoveContainer" containerID="33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.804511 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p" (OuterVolumeSpecName: "kube-api-access-j884p") pod "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" (UID: "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0"). InnerVolumeSpecName "kube-api-access-j884p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:43:35 crc kubenswrapper[4773]: E1012 20:43:35.810887 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca\": container with ID starting with 33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca not found: ID does not exist" containerID="33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.810927 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca"} err="failed to get container status \"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca\": rpc error: code = NotFound desc = could not find container \"33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca\": container with ID starting with 33faca352fd19dcdc146c9266d8d41cbba3e6418c44fa6ecfcf5dba2bc12caca not found: ID does not exist" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.810971 4773 scope.go:117] "RemoveContainer" containerID="a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060" Oct 12 20:43:35 crc kubenswrapper[4773]: E1012 20:43:35.811228 
4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060\": container with ID starting with a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060 not found: ID does not exist" containerID="a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.811251 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060"} err="failed to get container status \"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060\": rpc error: code = NotFound desc = could not find container \"a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060\": container with ID starting with a5da9f436d016a7ea422b6d2a2daaaad0bffdcc719e5a1925a1a079dc4543060 not found: ID does not exist" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.856544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data" (OuterVolumeSpecName: "config-data") pod "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" (UID: "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.865930 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" (UID: "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.890494 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle\") pod \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\" (UID: \"83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0\") " Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.890793 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.890812 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j884p\" (UniqueName: \"kubernetes.io/projected/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-kube-api-access-j884p\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.890823 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.890832 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.913963 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" (UID: "83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:35 crc kubenswrapper[4773]: I1012 20:43:35.992306 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.728822 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.775743 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.796266 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.803337 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:43:36 crc kubenswrapper[4773]: E1012 20:43:36.803774 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.803789 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" Oct 12 20:43:36 crc kubenswrapper[4773]: E1012 20:43:36.803810 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.803816 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.804000 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-log" Oct 12 20:43:36 crc 
kubenswrapper[4773]: I1012 20:43:36.804011 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" containerName="nova-metadata-metadata" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.804959 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.809203 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.810999 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.815410 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.907981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74ec9771-6918-4102-abd1-7b9130f91a4d-logs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.908038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.908158 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj7p7\" (UniqueName: \"kubernetes.io/projected/74ec9771-6918-4102-abd1-7b9130f91a4d-kube-api-access-sj7p7\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " 
pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.908173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-config-data\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:36 crc kubenswrapper[4773]: I1012 20:43:36.908232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.009918 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.010328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj7p7\" (UniqueName: \"kubernetes.io/projected/74ec9771-6918-4102-abd1-7b9130f91a4d-kube-api-access-sj7p7\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.010352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-config-data\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.010422 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.010451 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74ec9771-6918-4102-abd1-7b9130f91a4d-logs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.010835 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74ec9771-6918-4102-abd1-7b9130f91a4d-logs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.015329 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.015800 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-config-data\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.020908 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/74ec9771-6918-4102-abd1-7b9130f91a4d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " 
pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.027614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj7p7\" (UniqueName: \"kubernetes.io/projected/74ec9771-6918-4102-abd1-7b9130f91a4d-kube-api-access-sj7p7\") pod \"nova-metadata-0\" (UID: \"74ec9771-6918-4102-abd1-7b9130f91a4d\") " pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.133667 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.595224 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.688133 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.729521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrpzm\" (UniqueName: \"kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.729569 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.729726 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 
crc kubenswrapper[4773]: I1012 20:43:37.729747 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.729796 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.729884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs\") pod \"e2f29efb-8a93-4806-a572-03ec784612a9\" (UID: \"e2f29efb-8a93-4806-a572-03ec784612a9\") " Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.730952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs" (OuterVolumeSpecName: "logs") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.735910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm" (OuterVolumeSpecName: "kube-api-access-mrpzm") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "kube-api-access-mrpzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.737249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74ec9771-6918-4102-abd1-7b9130f91a4d","Type":"ContainerStarted","Data":"680822f2996c4f59a3f2c2999ba01d946356c10cb54157805bd5b551d30da24e"} Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.739423 4773 generic.go:334] "Generic (PLEG): container finished" podID="e2f29efb-8a93-4806-a572-03ec784612a9" containerID="54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180" exitCode=0 Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.739449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerDied","Data":"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180"} Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.739465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2f29efb-8a93-4806-a572-03ec784612a9","Type":"ContainerDied","Data":"e889a542126e0fdfbf5596ca90bad77d7f7391b2db555fbd64e0d8d61eb51609"} Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.739484 4773 scope.go:117] "RemoveContainer" containerID="54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.739579 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.764661 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data" (OuterVolumeSpecName: "config-data") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.772782 4773 scope.go:117] "RemoveContainer" containerID="d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.794013 4773 scope.go:117] "RemoveContainer" containerID="54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180" Oct 12 20:43:37 crc kubenswrapper[4773]: E1012 20:43:37.794516 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180\": container with ID starting with 54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180 not found: ID does not exist" containerID="54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.794562 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180"} err="failed to get container status \"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180\": rpc error: code = NotFound desc = could not find container \"54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180\": container with ID starting with 54c28c55cc88c6fe7bbbe14d2bae1257c106f9bf8b5e8ed95105c4109c7b5180 not found: ID does not exist" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.794596 4773 scope.go:117] "RemoveContainer" containerID="d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8" Oct 12 20:43:37 crc kubenswrapper[4773]: E1012 20:43:37.794984 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8\": container with ID starting with 
d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8 not found: ID does not exist" containerID="d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.795033 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8"} err="failed to get container status \"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8\": rpc error: code = NotFound desc = could not find container \"d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8\": container with ID starting with d7d6b5b232281d80d34dc72fd6f46d3b743607ddd86bd3dd540ca5089c69f2c8 not found: ID does not exist" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.800584 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.800660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.824860 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e2f29efb-8a93-4806-a572-03ec784612a9" (UID: "e2f29efb-8a93-4806-a572-03ec784612a9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832094 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832129 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrpzm\" (UniqueName: \"kubernetes.io/projected/e2f29efb-8a93-4806-a572-03ec784612a9-kube-api-access-mrpzm\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832139 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832151 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832163 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f29efb-8a93-4806-a572-03ec784612a9-logs\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:37 crc kubenswrapper[4773]: I1012 20:43:37.832172 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e2f29efb-8a93-4806-a572-03ec784612a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.072760 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.084855 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.108634 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:38 crc kubenswrapper[4773]: E1012 20:43:38.109525 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-log" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.109668 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-log" Oct 12 20:43:38 crc kubenswrapper[4773]: E1012 20:43:38.109832 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-api" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.109941 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-api" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.110320 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-api" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.110460 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" containerName="nova-api-log" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.112207 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.116031 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.117584 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.118008 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.135249 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.139641 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.139680 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.139802 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-public-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.139835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8163c65f-b48b-4fd5-b7c1-12d94abfa723-logs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.140896 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq6cp\" (UniqueName: \"kubernetes.io/projected/8163c65f-b48b-4fd5-b7c1-12d94abfa723-kube-api-access-qq6cp\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.140990 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-config-data\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.242588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8163c65f-b48b-4fd5-b7c1-12d94abfa723-logs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.242859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq6cp\" (UniqueName: \"kubernetes.io/projected/8163c65f-b48b-4fd5-b7c1-12d94abfa723-kube-api-access-qq6cp\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.243032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8163c65f-b48b-4fd5-b7c1-12d94abfa723-logs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: 
I1012 20:43:38.243107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-config-data\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.243233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.243300 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.243432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-public-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.246465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.246492 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-config-data\") pod \"nova-api-0\" (UID: 
\"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.246880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.258267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8163c65f-b48b-4fd5-b7c1-12d94abfa723-public-tls-certs\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.260875 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq6cp\" (UniqueName: \"kubernetes.io/projected/8163c65f-b48b-4fd5-b7c1-12d94abfa723-kube-api-access-qq6cp\") pod \"nova-api-0\" (UID: \"8163c65f-b48b-4fd5-b7c1-12d94abfa723\") " pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.434553 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.521041 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0" path="/var/lib/kubelet/pods/83aa9f8d-2429-4baa-b4d3-eba4e7f4edd0/volumes" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.526916 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f29efb-8a93-4806-a572-03ec784612a9" path="/var/lib/kubelet/pods/e2f29efb-8a93-4806-a572-03ec784612a9/volumes" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.753690 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74ec9771-6918-4102-abd1-7b9130f91a4d","Type":"ContainerStarted","Data":"2f73f1fea55f36acd831af92bbfba5234e550f515cba068180ca384a73d48c34"} Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.753766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74ec9771-6918-4102-abd1-7b9130f91a4d","Type":"ContainerStarted","Data":"ca7a50a1b48da3ca5f7b791f3ed4e9bab49609ba05dd91386f914d4fcfae9397"} Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.777176 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.7771567 podStartE2EDuration="2.7771567s" podCreationTimestamp="2025-10-12 20:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:38.775319159 +0000 UTC m=+1167.011617729" watchObservedRunningTime="2025-10-12 20:43:38.7771567 +0000 UTC m=+1167.013455260" Oct 12 20:43:38 crc kubenswrapper[4773]: I1012 20:43:38.931028 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 20:43:38 crc kubenswrapper[4773]: W1012 20:43:38.933669 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8163c65f_b48b_4fd5_b7c1_12d94abfa723.slice/crio-97b6abb5d21985a82731bf6576e2bc3a4289de0e21fdace4015a7af138e59f84 WatchSource:0}: Error finding container 97b6abb5d21985a82731bf6576e2bc3a4289de0e21fdace4015a7af138e59f84: Status 404 returned error can't find the container with id 97b6abb5d21985a82731bf6576e2bc3a4289de0e21fdace4015a7af138e59f84 Oct 12 20:43:39 crc kubenswrapper[4773]: I1012 20:43:39.101178 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 20:43:39 crc kubenswrapper[4773]: I1012 20:43:39.777152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8163c65f-b48b-4fd5-b7c1-12d94abfa723","Type":"ContainerStarted","Data":"581aa1c928beb1745b03af48b27d92dfeb6f8ad1cd14d2206e9639a19201133f"} Oct 12 20:43:39 crc kubenswrapper[4773]: I1012 20:43:39.777418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8163c65f-b48b-4fd5-b7c1-12d94abfa723","Type":"ContainerStarted","Data":"11e1dbdf5c58d80b04d8bc4833e770527fd6da2c2d9503b883a1377f494ee58f"} Oct 12 20:43:39 crc kubenswrapper[4773]: I1012 20:43:39.777430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8163c65f-b48b-4fd5-b7c1-12d94abfa723","Type":"ContainerStarted","Data":"97b6abb5d21985a82731bf6576e2bc3a4289de0e21fdace4015a7af138e59f84"} Oct 12 20:43:39 crc kubenswrapper[4773]: I1012 20:43:39.800378 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.800350145 podStartE2EDuration="1.800350145s" podCreationTimestamp="2025-10-12 20:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:43:39.798337538 +0000 UTC m=+1168.034636118" watchObservedRunningTime="2025-10-12 20:43:39.800350145 +0000 
UTC m=+1168.036648725" Oct 12 20:43:42 crc kubenswrapper[4773]: I1012 20:43:42.134248 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:43:42 crc kubenswrapper[4773]: I1012 20:43:42.134527 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 20:43:44 crc kubenswrapper[4773]: I1012 20:43:44.101575 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 20:43:44 crc kubenswrapper[4773]: I1012 20:43:44.136759 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 20:43:44 crc kubenswrapper[4773]: I1012 20:43:44.865123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 20:43:47 crc kubenswrapper[4773]: I1012 20:43:47.134955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 20:43:47 crc kubenswrapper[4773]: I1012 20:43:47.135280 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 20:43:48 crc kubenswrapper[4773]: I1012 20:43:48.148960 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74ec9771-6918-4102-abd1-7b9130f91a4d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:48 crc kubenswrapper[4773]: I1012 20:43:48.148971 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74ec9771-6918-4102-abd1-7b9130f91a4d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:48 crc kubenswrapper[4773]: 
I1012 20:43:48.434820 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:43:48 crc kubenswrapper[4773]: I1012 20:43:48.434885 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 20:43:49 crc kubenswrapper[4773]: I1012 20:43:49.447852 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8163c65f-b48b-4fd5-b7c1-12d94abfa723" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:49 crc kubenswrapper[4773]: I1012 20:43:49.447905 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8163c65f-b48b-4fd5-b7c1-12d94abfa723" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 20:43:51 crc kubenswrapper[4773]: I1012 20:43:51.898230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 20:43:57 crc kubenswrapper[4773]: I1012 20:43:57.141995 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 20:43:57 crc kubenswrapper[4773]: I1012 20:43:57.146623 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 20:43:57 crc kubenswrapper[4773]: I1012 20:43:57.156406 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 20:43:57 crc kubenswrapper[4773]: I1012 20:43:57.961893 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.440695 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.442076 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.442165 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.448114 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.967196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 20:43:58 crc kubenswrapper[4773]: I1012 20:43:58.976661 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 20:44:06 crc kubenswrapper[4773]: I1012 20:44:06.550947 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:07 crc kubenswrapper[4773]: I1012 20:44:07.722658 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:10 crc kubenswrapper[4773]: I1012 20:44:10.791528 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="rabbitmq" containerID="cri-o://a6909d6d8e160f1ec003b37dba4779ea8176a13b2213803ed9a6ad4c2f6c1825" gracePeriod=604796 Oct 12 20:44:11 crc kubenswrapper[4773]: I1012 20:44:11.870081 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="rabbitmq" containerID="cri-o://33d77ea1ef4c3a341a14af56a3ff85779969d2a2e4eb8023361463bdd0c31c51" gracePeriod=604796 Oct 12 20:44:16 crc kubenswrapper[4773]: I1012 20:44:16.368067 4773 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-server-0" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 12 20:44:16 crc kubenswrapper[4773]: I1012 20:44:16.994421 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.116242 4773 generic.go:334] "Generic (PLEG): container finished" podID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerID="a6909d6d8e160f1ec003b37dba4779ea8176a13b2213803ed9a6ad4c2f6c1825" exitCode=0 Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.116283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerDied","Data":"a6909d6d8e160f1ec003b37dba4779ea8176a13b2213803ed9a6ad4c2f6c1825"} Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.305184 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.385455 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.385797 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.385923 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc8ll\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.385946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386384 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386510 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386573 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\" (UID: \"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3\") " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.386890 
4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.387211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.387661 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.387681 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.389020 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.397802 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll" (OuterVolumeSpecName: "kube-api-access-xc8ll") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "kube-api-access-xc8ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.398395 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.401024 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info" (OuterVolumeSpecName: "pod-info") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.408712 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.411287 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.424968 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data" (OuterVolumeSpecName: "config-data") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.465670 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf" (OuterVolumeSpecName: "server-conf") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491622 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491654 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc8ll\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-kube-api-access-xc8ll\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491665 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491673 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491681 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491689 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491793 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.491805 4773 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.518303 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.549479 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" (UID: "f8ee60c2-5884-4c71-9b5f-ecc15b9663a3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.592881 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:17 crc kubenswrapper[4773]: I1012 20:44:17.592910 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.130443 4773 generic.go:334] "Generic (PLEG): container finished" podID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerID="33d77ea1ef4c3a341a14af56a3ff85779969d2a2e4eb8023361463bdd0c31c51" exitCode=0 Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.130688 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerDied","Data":"33d77ea1ef4c3a341a14af56a3ff85779969d2a2e4eb8023361463bdd0c31c51"} Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 
20:44:18.137612 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f8ee60c2-5884-4c71-9b5f-ecc15b9663a3","Type":"ContainerDied","Data":"421148360c5536826f42d7d029b7c9e4756c7362409615daf033eaff55530534"} Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.137662 4773 scope.go:117] "RemoveContainer" containerID="a6909d6d8e160f1ec003b37dba4779ea8176a13b2213803ed9a6ad4c2f6c1825" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.137688 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.173075 4773 scope.go:117] "RemoveContainer" containerID="c084c720b443d756cc911fdaf91f39912d929d0f6e679aeabcc4afc8d9674e4c" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.180024 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.189524 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.233691 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:18 crc kubenswrapper[4773]: E1012 20:44:18.234173 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="setup-container" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.234196 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="setup-container" Oct 12 20:44:18 crc kubenswrapper[4773]: E1012 20:44:18.234209 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="rabbitmq" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.234218 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="rabbitmq" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.234442 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" containerName="rabbitmq" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.235360 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.238155 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.238349 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.238649 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.238851 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-crqm6" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.239096 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.242077 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.243122 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.259906 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306512 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b0fae69-d926-472c-a222-3a98f25a1e14-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306648 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b0fae69-d926-472c-a222-3a98f25a1e14-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx7jf\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-kube-api-access-cx7jf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306732 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306749 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.306768 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx7jf\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-kube-api-access-cx7jf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408222 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408237 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408257 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0b0fae69-d926-472c-a222-3a98f25a1e14-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/0b0fae69-d926-472c-a222-3a98f25a1e14-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.408761 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.411458 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.412311 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.419362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.422429 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0b0fae69-d926-472c-a222-3a98f25a1e14-config-data\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 
12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.423607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.426754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.426835 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0b0fae69-d926-472c-a222-3a98f25a1e14-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.430375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.435371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx7jf\" (UniqueName: \"kubernetes.io/projected/0b0fae69-d926-472c-a222-3a98f25a1e14-kube-api-access-cx7jf\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.445868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0b0fae69-d926-472c-a222-3a98f25a1e14-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.492271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0b0fae69-d926-472c-a222-3a98f25a1e14\") " pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.494025 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ee60c2-5884-4c71-9b5f-ecc15b9663a3" path="/var/lib/kubelet/pods/f8ee60c2-5884-4c71-9b5f-ecc15b9663a3/volumes" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.523029 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.609557 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619617 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619706 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619805 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619867 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619919 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.619958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.620014 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.620120 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcr9t\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.620204 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.620233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.620278 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info\") pod \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\" (UID: \"36de8afd-4afa-44e6-9d8e-a6c8de0d4707\") " Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 
20:44:18.621259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.621643 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.623582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.625703 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.625849 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.635647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info" (OuterVolumeSpecName: "pod-info") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.641929 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.651501 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t" (OuterVolumeSpecName: "kube-api-access-wcr9t") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "kube-api-access-wcr9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.667098 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data" (OuterVolumeSpecName: "config-data") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.682312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf" (OuterVolumeSpecName: "server-conf") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721300 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721344 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721357 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721384 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 
12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721393 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcr9t\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-kube-api-access-wcr9t\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721402 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721410 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721421 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721432 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.721442 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.750066 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.760071 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "36de8afd-4afa-44e6-9d8e-a6c8de0d4707" (UID: "36de8afd-4afa-44e6-9d8e-a6c8de0d4707"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.823160 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:18 crc kubenswrapper[4773]: I1012 20:44:18.823186 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36de8afd-4afa-44e6-9d8e-a6c8de0d4707-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.096679 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.149467 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0b0fae69-d926-472c-a222-3a98f25a1e14","Type":"ContainerStarted","Data":"12af5bbc016cd2d8bbb376220786eb4e370d3084c937c049506120b8f49fd714"} Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.151606 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36de8afd-4afa-44e6-9d8e-a6c8de0d4707","Type":"ContainerDied","Data":"e75f2c899b3685bdbff3909de10f8480e11609fecef89b813dc645936daa8835"} Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.151628 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.151652 4773 scope.go:117] "RemoveContainer" containerID="33d77ea1ef4c3a341a14af56a3ff85779969d2a2e4eb8023361463bdd0c31c51" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.224896 4773 scope.go:117] "RemoveContainer" containerID="ff7ab913cf70bfc7acc2d49ac16a439d42227137b3387a23020f1913b05254dc" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.278060 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.303926 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.312182 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:19 crc kubenswrapper[4773]: E1012 20:44:19.312591 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="rabbitmq" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.312605 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="rabbitmq" Oct 12 20:44:19 crc kubenswrapper[4773]: E1012 20:44:19.312624 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="setup-container" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.312631 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="setup-container" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.312821 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" containerName="rabbitmq" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.313770 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.319276 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325198 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325333 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325542 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325601 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x5n8n" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325769 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325283 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.325568 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435665 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435733 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435797 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435823 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435885 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjll7\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-kube-api-access-sjll7\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.435957 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537560 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537622 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537686 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537712 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537761 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjll7\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-kube-api-access-sjll7\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537786 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.537803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 
20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.538554 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.538859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.539450 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.539969 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.540085 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.540316 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.541264 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.542423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.542692 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.544634 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.554483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjll7\" (UniqueName: \"kubernetes.io/projected/3c6bb2e3-2f0e-499a-b349-07ea3eb7190d-kube-api-access-sjll7\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.565270 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:19 crc kubenswrapper[4773]: I1012 20:44:19.644516 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:20 crc kubenswrapper[4773]: W1012 20:44:20.101002 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c6bb2e3_2f0e_499a_b349_07ea3eb7190d.slice/crio-a6fad9e750a87bfa6ec6ce6ec6f6b0798eef9b95dec989becf5003c306524aea WatchSource:0}: Error finding container a6fad9e750a87bfa6ec6ce6ec6f6b0798eef9b95dec989becf5003c306524aea: Status 404 returned error can't find the container with id a6fad9e750a87bfa6ec6ce6ec6f6b0798eef9b95dec989becf5003c306524aea Oct 12 20:44:20 crc kubenswrapper[4773]: I1012 20:44:20.101116 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 20:44:20 crc kubenswrapper[4773]: I1012 20:44:20.164391 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d","Type":"ContainerStarted","Data":"a6fad9e750a87bfa6ec6ce6ec6f6b0798eef9b95dec989becf5003c306524aea"} Oct 12 20:44:20 crc kubenswrapper[4773]: I1012 20:44:20.491872 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36de8afd-4afa-44e6-9d8e-a6c8de0d4707" path="/var/lib/kubelet/pods/36de8afd-4afa-44e6-9d8e-a6c8de0d4707/volumes" Oct 12 20:44:21 crc kubenswrapper[4773]: I1012 20:44:21.177280 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0b0fae69-d926-472c-a222-3a98f25a1e14","Type":"ContainerStarted","Data":"c6653c402032ae5b132e25cedc40d1727a0890a46ffadcbefbbc69278c900db9"} Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.196164 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d","Type":"ContainerStarted","Data":"122610d7a2bf2d91956a5a356cce19b72bff51bc81320ac9da00facb0e1f1c59"} Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.801532 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.803462 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.806372 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.826047 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.899512 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.899576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " 
pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.899609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfxs\" (UniqueName: \"kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.899929 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.900019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:22 crc kubenswrapper[4773]: I1012 20:44:22.900090 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001596 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " 
pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001656 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001726 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001800 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfxs\" (UniqueName: \"kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.001863 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" 
Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.002618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.002667 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.002836 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.002855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.003020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.028930 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qmfxs\" (UniqueName: \"kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs\") pod \"dnsmasq-dns-64fb5d8fd7-fcnn8\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.132557 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:23 crc kubenswrapper[4773]: W1012 20:44:23.603954 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e39b6c_1aac_4a24_abe1_080e5b16ac62.slice/crio-58708dceca3218ef22ea97438bb9d90f0e1e0c430c90078d715afc5758b9d6cd WatchSource:0}: Error finding container 58708dceca3218ef22ea97438bb9d90f0e1e0c430c90078d715afc5758b9d6cd: Status 404 returned error can't find the container with id 58708dceca3218ef22ea97438bb9d90f0e1e0c430c90078d715afc5758b9d6cd Oct 12 20:44:23 crc kubenswrapper[4773]: I1012 20:44:23.609405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:24 crc kubenswrapper[4773]: I1012 20:44:24.214970 4773 generic.go:334] "Generic (PLEG): container finished" podID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerID="60192c38678800c274ff664a209cd0f2ec65a9baf116acbb1ef29bb8cf676edd" exitCode=0 Oct 12 20:44:24 crc kubenswrapper[4773]: I1012 20:44:24.215017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" event={"ID":"c5e39b6c-1aac-4a24-abe1-080e5b16ac62","Type":"ContainerDied","Data":"60192c38678800c274ff664a209cd0f2ec65a9baf116acbb1ef29bb8cf676edd"} Oct 12 20:44:24 crc kubenswrapper[4773]: I1012 20:44:24.215047 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" 
event={"ID":"c5e39b6c-1aac-4a24-abe1-080e5b16ac62","Type":"ContainerStarted","Data":"58708dceca3218ef22ea97438bb9d90f0e1e0c430c90078d715afc5758b9d6cd"} Oct 12 20:44:25 crc kubenswrapper[4773]: I1012 20:44:25.227340 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" event={"ID":"c5e39b6c-1aac-4a24-abe1-080e5b16ac62","Type":"ContainerStarted","Data":"457e2241ddfcec252073d81364ef29b9e14d030230ba00ec25c9f3804ce229f6"} Oct 12 20:44:25 crc kubenswrapper[4773]: I1012 20:44:25.228699 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:25 crc kubenswrapper[4773]: I1012 20:44:25.255415 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" podStartSLOduration=3.255390002 podStartE2EDuration="3.255390002s" podCreationTimestamp="2025-10-12 20:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:44:25.245807442 +0000 UTC m=+1213.482106033" watchObservedRunningTime="2025-10-12 20:44:25.255390002 +0000 UTC m=+1213.491688602" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.133941 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.212196 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.212415 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-665946c669-gtbdj" podUID="526df519-a931-4d53-b729-3256ced8c813" containerName="dnsmasq-dns" containerID="cri-o://a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32" gracePeriod=10 Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.416727 4773 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.418471 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.438885 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528099 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpn8\" (UniqueName: \"kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: 
\"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.528269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.629695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.629799 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.629851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " 
pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.629906 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpn8\" (UniqueName: \"kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.629959 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.630063 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.630527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.631633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 
20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.631812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.632375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.632959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.650909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpn8\" (UniqueName: \"kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8\") pod \"dnsmasq-dns-867c8fd5c5-t9wtm\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.730895 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.734253 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.834564 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc\") pod \"526df519-a931-4d53-b729-3256ced8c813\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.835430 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkjd4\" (UniqueName: \"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4\") pod \"526df519-a931-4d53-b729-3256ced8c813\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.836150 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb\") pod \"526df519-a931-4d53-b729-3256ced8c813\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.836243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config\") pod \"526df519-a931-4d53-b729-3256ced8c813\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.836317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb\") pod \"526df519-a931-4d53-b729-3256ced8c813\" (UID: \"526df519-a931-4d53-b729-3256ced8c813\") " Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.843242 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4" (OuterVolumeSpecName: "kube-api-access-mkjd4") pod "526df519-a931-4d53-b729-3256ced8c813" (UID: "526df519-a931-4d53-b729-3256ced8c813"). InnerVolumeSpecName "kube-api-access-mkjd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.912751 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config" (OuterVolumeSpecName: "config") pod "526df519-a931-4d53-b729-3256ced8c813" (UID: "526df519-a931-4d53-b729-3256ced8c813"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.925742 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "526df519-a931-4d53-b729-3256ced8c813" (UID: "526df519-a931-4d53-b729-3256ced8c813"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.929689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "526df519-a931-4d53-b729-3256ced8c813" (UID: "526df519-a931-4d53-b729-3256ced8c813"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.938969 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkjd4\" (UniqueName: \"kubernetes.io/projected/526df519-a931-4d53-b729-3256ced8c813-kube-api-access-mkjd4\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.939000 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.939013 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.939024 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:33 crc kubenswrapper[4773]: I1012 20:44:33.941330 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "526df519-a931-4d53-b729-3256ced8c813" (UID: "526df519-a931-4d53-b729-3256ced8c813"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.040153 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/526df519-a931-4d53-b729-3256ced8c813-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.265560 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.357816 4773 generic.go:334] "Generic (PLEG): container finished" podID="526df519-a931-4d53-b729-3256ced8c813" containerID="a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32" exitCode=0 Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.357892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665946c669-gtbdj" event={"ID":"526df519-a931-4d53-b729-3256ced8c813","Type":"ContainerDied","Data":"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32"} Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.357926 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665946c669-gtbdj" event={"ID":"526df519-a931-4d53-b729-3256ced8c813","Type":"ContainerDied","Data":"2ff8ea2c34f8916e884848f261b439dad0a5f2e1e475965ef1cd5e464d0578a2"} Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.357947 4773 scope.go:117] "RemoveContainer" containerID="a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.358092 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-665946c669-gtbdj" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.371104 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" event={"ID":"2ec59ef6-fa6e-457f-9042-f77dfa673dde","Type":"ContainerStarted","Data":"ee026c72e6f4121a855b1555adc920ccd0547441cdd039fb63ab8a2b125aa6f3"} Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.444317 4773 scope.go:117] "RemoveContainer" containerID="ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.467732 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.471371 4773 scope.go:117] "RemoveContainer" containerID="a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32" Oct 12 20:44:34 crc kubenswrapper[4773]: E1012 20:44:34.471799 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32\": container with ID starting with a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32 not found: ID does not exist" containerID="a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.471826 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32"} err="failed to get container status \"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32\": rpc error: code = NotFound desc = could not find container \"a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32\": container with ID starting with a22cabd9ee09695d9c35c0d422bd614e68e04e8e0151f8f8ec87a44f4aed4b32 not found: ID does not exist" Oct 12 20:44:34 crc 
kubenswrapper[4773]: I1012 20:44:34.471847 4773 scope.go:117] "RemoveContainer" containerID="ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c" Oct 12 20:44:34 crc kubenswrapper[4773]: E1012 20:44:34.472694 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c\": container with ID starting with ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c not found: ID does not exist" containerID="ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.472737 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c"} err="failed to get container status \"ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c\": rpc error: code = NotFound desc = could not find container \"ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c\": container with ID starting with ecebe46d1519253e60bf8d1a78127fab9ee85fe9c12284a501f1fd6d3bb0915c not found: ID does not exist" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.477898 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-665946c669-gtbdj"] Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.493324 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="526df519-a931-4d53-b729-3256ced8c813" path="/var/lib/kubelet/pods/526df519-a931-4d53-b729-3256ced8c813/volumes" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.537924 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx"] Oct 12 20:44:34 crc kubenswrapper[4773]: E1012 20:44:34.538276 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526df519-a931-4d53-b729-3256ced8c813" 
containerName="dnsmasq-dns" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.538293 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="526df519-a931-4d53-b729-3256ced8c813" containerName="dnsmasq-dns" Oct 12 20:44:34 crc kubenswrapper[4773]: E1012 20:44:34.538326 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="526df519-a931-4d53-b729-3256ced8c813" containerName="init" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.538333 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="526df519-a931-4d53-b729-3256ced8c813" containerName="init" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.538502 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="526df519-a931-4d53-b729-3256ced8c813" containerName="dnsmasq-dns" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.539111 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.546249 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.546884 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.547099 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.547221 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.560189 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx"] Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.655174 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.655244 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.655359 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.655448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8sw\" (UniqueName: \"kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.757497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.757557 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.757661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.757710 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8sw\" (UniqueName: \"kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.762443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.771912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.773562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.776269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8sw\" (UniqueName: \"kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:34 crc kubenswrapper[4773]: I1012 20:44:34.854446 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:35 crc kubenswrapper[4773]: I1012 20:44:35.380517 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerID="ac22d73968d727fd3afffa18c707f80741276767b4d9e94afe8b1fa53abe0858" exitCode=0 Oct 12 20:44:35 crc kubenswrapper[4773]: I1012 20:44:35.380872 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" event={"ID":"2ec59ef6-fa6e-457f-9042-f77dfa673dde","Type":"ContainerDied","Data":"ac22d73968d727fd3afffa18c707f80741276767b4d9e94afe8b1fa53abe0858"} Oct 12 20:44:35 crc kubenswrapper[4773]: I1012 20:44:35.612228 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx"] Oct 12 20:44:35 crc kubenswrapper[4773]: I1012 20:44:35.617213 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:44:36 crc kubenswrapper[4773]: I1012 20:44:36.389680 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" event={"ID":"6247dcc5-2188-4112-b4b8-b53023878263","Type":"ContainerStarted","Data":"e50d924af9c5cb37b608acf0111c9fcaba5a8a417d796a005e0fa6e213638aeb"} Oct 12 20:44:36 crc kubenswrapper[4773]: I1012 20:44:36.391970 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" event={"ID":"2ec59ef6-fa6e-457f-9042-f77dfa673dde","Type":"ContainerStarted","Data":"601481733d4fbe13c6bfb4d6f05339b018c18769d7ab14bfce96bc014b029e48"} Oct 12 20:44:36 crc kubenswrapper[4773]: I1012 20:44:36.392357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:42 crc kubenswrapper[4773]: I1012 20:44:42.507064 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" podStartSLOduration=9.507046794 podStartE2EDuration="9.507046794s" podCreationTimestamp="2025-10-12 20:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:44:36.416409112 +0000 UTC m=+1224.652707692" watchObservedRunningTime="2025-10-12 20:44:42.507046794 +0000 UTC m=+1230.743345354" Oct 12 20:44:43 crc kubenswrapper[4773]: I1012 20:44:43.735906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 20:44:43 crc kubenswrapper[4773]: I1012 20:44:43.842316 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:43 crc kubenswrapper[4773]: I1012 20:44:43.842565 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="dnsmasq-dns" containerID="cri-o://457e2241ddfcec252073d81364ef29b9e14d030230ba00ec25c9f3804ce229f6" gracePeriod=10 Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.500486 4773 generic.go:334] "Generic (PLEG): container finished" podID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerID="457e2241ddfcec252073d81364ef29b9e14d030230ba00ec25c9f3804ce229f6" exitCode=0 Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.514835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" event={"ID":"c5e39b6c-1aac-4a24-abe1-080e5b16ac62","Type":"ContainerDied","Data":"457e2241ddfcec252073d81364ef29b9e14d030230ba00ec25c9f3804ce229f6"} Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.665324 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.766838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.767007 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.767094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfxs\" (UniqueName: \"kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.767115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.767147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.767215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb\") pod \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\" (UID: \"c5e39b6c-1aac-4a24-abe1-080e5b16ac62\") " Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.771425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs" (OuterVolumeSpecName: "kube-api-access-qmfxs") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "kube-api-access-qmfxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.808884 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.808933 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.814191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.822197 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.826292 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config" (OuterVolumeSpecName: "config") pod "c5e39b6c-1aac-4a24-abe1-080e5b16ac62" (UID: "c5e39b6c-1aac-4a24-abe1-080e5b16ac62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.869307 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.869352 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.869368 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmfxs\" (UniqueName: \"kubernetes.io/projected/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-kube-api-access-qmfxs\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.869381 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-config\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:44 crc 
kubenswrapper[4773]: I1012 20:44:44.869393 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:44 crc kubenswrapper[4773]: I1012 20:44:44.869403 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5e39b6c-1aac-4a24-abe1-080e5b16ac62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.509772 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" event={"ID":"c5e39b6c-1aac-4a24-abe1-080e5b16ac62","Type":"ContainerDied","Data":"58708dceca3218ef22ea97438bb9d90f0e1e0c430c90078d715afc5758b9d6cd"} Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.509819 4773 scope.go:117] "RemoveContainer" containerID="457e2241ddfcec252073d81364ef29b9e14d030230ba00ec25c9f3804ce229f6" Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.509950 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64fb5d8fd7-fcnn8" Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.520950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" event={"ID":"6247dcc5-2188-4112-b4b8-b53023878263","Type":"ContainerStarted","Data":"6fd9d00fd866419e79b85ba425c00b7a84997153f12f6c0c25b1b8d4de8c977c"} Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.546515 4773 scope.go:117] "RemoveContainer" containerID="60192c38678800c274ff664a209cd0f2ec65a9baf116acbb1ef29bb8cf676edd" Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.550475 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" podStartSLOduration=2.735388985 podStartE2EDuration="11.550462109s" podCreationTimestamp="2025-10-12 20:44:34 +0000 UTC" firstStartedPulling="2025-10-12 20:44:35.616998879 +0000 UTC m=+1223.853297439" lastFinishedPulling="2025-10-12 20:44:44.432072003 +0000 UTC m=+1232.668370563" observedRunningTime="2025-10-12 20:44:45.545758386 +0000 UTC m=+1233.782056946" watchObservedRunningTime="2025-10-12 20:44:45.550462109 +0000 UTC m=+1233.786760669" Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.590906 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:45 crc kubenswrapper[4773]: I1012 20:44:45.597301 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64fb5d8fd7-fcnn8"] Oct 12 20:44:46 crc kubenswrapper[4773]: I1012 20:44:46.492482 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" path="/var/lib/kubelet/pods/c5e39b6c-1aac-4a24-abe1-080e5b16ac62/volumes" Oct 12 20:44:53 crc kubenswrapper[4773]: I1012 20:44:53.583774 4773 generic.go:334] "Generic (PLEG): container finished" podID="0b0fae69-d926-472c-a222-3a98f25a1e14" 
containerID="c6653c402032ae5b132e25cedc40d1727a0890a46ffadcbefbbc69278c900db9" exitCode=0 Oct 12 20:44:53 crc kubenswrapper[4773]: I1012 20:44:53.584173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0b0fae69-d926-472c-a222-3a98f25a1e14","Type":"ContainerDied","Data":"c6653c402032ae5b132e25cedc40d1727a0890a46ffadcbefbbc69278c900db9"} Oct 12 20:44:54 crc kubenswrapper[4773]: I1012 20:44:54.594878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0b0fae69-d926-472c-a222-3a98f25a1e14","Type":"ContainerStarted","Data":"e05286be703bc041630d7017e59da29a51ae3b17c572cc9354b876344bdebd8a"} Oct 12 20:44:54 crc kubenswrapper[4773]: I1012 20:44:54.596493 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 12 20:44:54 crc kubenswrapper[4773]: I1012 20:44:54.598019 4773 generic.go:334] "Generic (PLEG): container finished" podID="3c6bb2e3-2f0e-499a-b349-07ea3eb7190d" containerID="122610d7a2bf2d91956a5a356cce19b72bff51bc81320ac9da00facb0e1f1c59" exitCode=0 Oct 12 20:44:54 crc kubenswrapper[4773]: I1012 20:44:54.598043 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d","Type":"ContainerDied","Data":"122610d7a2bf2d91956a5a356cce19b72bff51bc81320ac9da00facb0e1f1c59"} Oct 12 20:44:54 crc kubenswrapper[4773]: I1012 20:44:54.629514 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.629497597 podStartE2EDuration="36.629497597s" podCreationTimestamp="2025-10-12 20:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:44:54.621331587 +0000 UTC m=+1242.857630147" watchObservedRunningTime="2025-10-12 20:44:54.629497597 +0000 UTC m=+1242.865796157" Oct 12 
20:44:55 crc kubenswrapper[4773]: I1012 20:44:55.609389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c6bb2e3-2f0e-499a-b349-07ea3eb7190d","Type":"ContainerStarted","Data":"6b24e117abe8aa7ca8a10c48a2c25bd69412849c1958b96d1c933354db50f68f"} Oct 12 20:44:55 crc kubenswrapper[4773]: I1012 20:44:55.609932 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:44:55 crc kubenswrapper[4773]: I1012 20:44:55.645871 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.645856677 podStartE2EDuration="36.645856677s" podCreationTimestamp="2025-10-12 20:44:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:44:55.64347761 +0000 UTC m=+1243.879776170" watchObservedRunningTime="2025-10-12 20:44:55.645856677 +0000 UTC m=+1243.882155237" Oct 12 20:44:56 crc kubenswrapper[4773]: I1012 20:44:56.619312 4773 generic.go:334] "Generic (PLEG): container finished" podID="6247dcc5-2188-4112-b4b8-b53023878263" containerID="6fd9d00fd866419e79b85ba425c00b7a84997153f12f6c0c25b1b8d4de8c977c" exitCode=0 Oct 12 20:44:56 crc kubenswrapper[4773]: I1012 20:44:56.619451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" event={"ID":"6247dcc5-2188-4112-b4b8-b53023878263","Type":"ContainerDied","Data":"6fd9d00fd866419e79b85ba425c00b7a84997153f12f6c0c25b1b8d4de8c977c"} Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.111460 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.238504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx8sw\" (UniqueName: \"kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw\") pod \"6247dcc5-2188-4112-b4b8-b53023878263\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.238602 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory\") pod \"6247dcc5-2188-4112-b4b8-b53023878263\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.238629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle\") pod \"6247dcc5-2188-4112-b4b8-b53023878263\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.238663 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key\") pod \"6247dcc5-2188-4112-b4b8-b53023878263\" (UID: \"6247dcc5-2188-4112-b4b8-b53023878263\") " Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.251758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6247dcc5-2188-4112-b4b8-b53023878263" (UID: "6247dcc5-2188-4112-b4b8-b53023878263"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.252131 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw" (OuterVolumeSpecName: "kube-api-access-bx8sw") pod "6247dcc5-2188-4112-b4b8-b53023878263" (UID: "6247dcc5-2188-4112-b4b8-b53023878263"). InnerVolumeSpecName "kube-api-access-bx8sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.292076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory" (OuterVolumeSpecName: "inventory") pod "6247dcc5-2188-4112-b4b8-b53023878263" (UID: "6247dcc5-2188-4112-b4b8-b53023878263"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.327936 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6247dcc5-2188-4112-b4b8-b53023878263" (UID: "6247dcc5-2188-4112-b4b8-b53023878263"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.344146 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx8sw\" (UniqueName: \"kubernetes.io/projected/6247dcc5-2188-4112-b4b8-b53023878263-kube-api-access-bx8sw\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.344178 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.344188 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.344198 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6247dcc5-2188-4112-b4b8-b53023878263-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.638156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" event={"ID":"6247dcc5-2188-4112-b4b8-b53023878263","Type":"ContainerDied","Data":"e50d924af9c5cb37b608acf0111c9fcaba5a8a417d796a005e0fa6e213638aeb"} Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.638204 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50d924af9c5cb37b608acf0111c9fcaba5a8a417d796a005e0fa6e213638aeb" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.638275 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.669203 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.669263 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.722461 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw"] Oct 12 20:44:58 crc kubenswrapper[4773]: E1012 20:44:58.722941 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="dnsmasq-dns" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.722974 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="dnsmasq-dns" Oct 12 20:44:58 crc kubenswrapper[4773]: E1012 20:44:58.722988 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="init" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.722996 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="init" Oct 12 20:44:58 crc kubenswrapper[4773]: E1012 20:44:58.723028 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6247dcc5-2188-4112-b4b8-b53023878263" 
containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.723038 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6247dcc5-2188-4112-b4b8-b53023878263" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.723271 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e39b6c-1aac-4a24-abe1-080e5b16ac62" containerName="dnsmasq-dns" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.723302 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6247dcc5-2188-4112-b4b8-b53023878263" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.724068 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.733702 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw"] Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.734200 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.734200 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.734526 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.751758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfr2p\" (UniqueName: \"kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.751837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.751883 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.751973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.761974 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.853540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfr2p\" (UniqueName: \"kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.853610 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.853652 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.853762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.858425 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.858443 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.865564 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:58 crc kubenswrapper[4773]: I1012 20:44:58.872490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfr2p\" (UniqueName: \"kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:59 crc kubenswrapper[4773]: I1012 20:44:59.062675 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:44:59 crc kubenswrapper[4773]: I1012 20:44:59.645114 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw"] Oct 12 20:44:59 crc kubenswrapper[4773]: W1012 20:44:59.653534 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod110bf492_a57a_4b15_9785_ba1947f4d06b.slice/crio-38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9 WatchSource:0}: Error finding container 38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9: Status 404 returned error can't find the container with id 38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9 Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.146780 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p"] Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.148154 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.150010 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.150580 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.179246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.179502 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.179609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhl49\" (UniqueName: \"kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.196829 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p"] Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.283240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.283949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhl49\" (UniqueName: \"kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.284215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.285574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.299569 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.311415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhl49\" (UniqueName: \"kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49\") pod \"collect-profiles-29338365-h522p\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.515782 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.681043 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" event={"ID":"110bf492-a57a-4b15-9785-ba1947f4d06b","Type":"ContainerStarted","Data":"7038f683eeca1036e9a30c385899c9aa4f981e3ad593e1dab28acbf13f2968e0"} Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.681332 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" event={"ID":"110bf492-a57a-4b15-9785-ba1947f4d06b","Type":"ContainerStarted","Data":"38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9"} Oct 12 20:45:00 crc kubenswrapper[4773]: I1012 20:45:00.705350 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" podStartSLOduration=2.30416923 podStartE2EDuration="2.705308831s" podCreationTimestamp="2025-10-12 20:44:58 +0000 UTC" firstStartedPulling="2025-10-12 20:44:59.657522577 +0000 UTC m=+1247.893821147" lastFinishedPulling="2025-10-12 
20:45:00.058662188 +0000 UTC m=+1248.294960748" observedRunningTime="2025-10-12 20:45:00.696592575 +0000 UTC m=+1248.932891135" watchObservedRunningTime="2025-10-12 20:45:00.705308831 +0000 UTC m=+1248.941607391" Oct 12 20:45:01 crc kubenswrapper[4773]: I1012 20:45:01.009956 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p"] Oct 12 20:45:01 crc kubenswrapper[4773]: W1012 20:45:01.016802 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82148a00_62d2_4597_8574_8726b05e9082.slice/crio-1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc WatchSource:0}: Error finding container 1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc: Status 404 returned error can't find the container with id 1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc Oct 12 20:45:01 crc kubenswrapper[4773]: I1012 20:45:01.690174 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" event={"ID":"82148a00-62d2-4597-8574-8726b05e9082","Type":"ContainerStarted","Data":"56fd99cca96ff004c65d493eeaf6e47bdc6205ac0ad51708d842b078ec07e472"} Oct 12 20:45:01 crc kubenswrapper[4773]: I1012 20:45:01.690571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" event={"ID":"82148a00-62d2-4597-8574-8726b05e9082","Type":"ContainerStarted","Data":"1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc"} Oct 12 20:45:01 crc kubenswrapper[4773]: I1012 20:45:01.710854 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" podStartSLOduration=1.7108315649999999 podStartE2EDuration="1.710831565s" podCreationTimestamp="2025-10-12 20:45:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 20:45:01.701708498 +0000 UTC m=+1249.938007078" watchObservedRunningTime="2025-10-12 20:45:01.710831565 +0000 UTC m=+1249.947130125" Oct 12 20:45:02 crc kubenswrapper[4773]: I1012 20:45:02.699623 4773 generic.go:334] "Generic (PLEG): container finished" podID="82148a00-62d2-4597-8574-8726b05e9082" containerID="56fd99cca96ff004c65d493eeaf6e47bdc6205ac0ad51708d842b078ec07e472" exitCode=0 Oct 12 20:45:02 crc kubenswrapper[4773]: I1012 20:45:02.700008 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" event={"ID":"82148a00-62d2-4597-8574-8726b05e9082","Type":"ContainerDied","Data":"56fd99cca96ff004c65d493eeaf6e47bdc6205ac0ad51708d842b078ec07e472"} Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.062137 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.159308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhl49\" (UniqueName: \"kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49\") pod \"82148a00-62d2-4597-8574-8726b05e9082\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.159495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume\") pod \"82148a00-62d2-4597-8574-8726b05e9082\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.159738 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume\") pod \"82148a00-62d2-4597-8574-8726b05e9082\" (UID: \"82148a00-62d2-4597-8574-8726b05e9082\") " Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.160227 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume" (OuterVolumeSpecName: "config-volume") pod "82148a00-62d2-4597-8574-8726b05e9082" (UID: "82148a00-62d2-4597-8574-8726b05e9082"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.160447 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82148a00-62d2-4597-8574-8726b05e9082-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.165712 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49" (OuterVolumeSpecName: "kube-api-access-lhl49") pod "82148a00-62d2-4597-8574-8726b05e9082" (UID: "82148a00-62d2-4597-8574-8726b05e9082"). InnerVolumeSpecName "kube-api-access-lhl49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.166312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82148a00-62d2-4597-8574-8726b05e9082" (UID: "82148a00-62d2-4597-8574-8726b05e9082"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.262286 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82148a00-62d2-4597-8574-8726b05e9082-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.262325 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhl49\" (UniqueName: \"kubernetes.io/projected/82148a00-62d2-4597-8574-8726b05e9082-kube-api-access-lhl49\") on node \"crc\" DevicePath \"\"" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.718166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" event={"ID":"82148a00-62d2-4597-8574-8726b05e9082","Type":"ContainerDied","Data":"1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc"} Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.718203 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee6b7e8265d6c6744711f7d06311f4a48f455cc04e15786d2fdf205ea9051dc" Oct 12 20:45:04 crc kubenswrapper[4773]: I1012 20:45:04.718236 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p" Oct 12 20:45:08 crc kubenswrapper[4773]: I1012 20:45:08.612957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 20:45:09 crc kubenswrapper[4773]: I1012 20:45:09.648960 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 20:45:28 crc kubenswrapper[4773]: I1012 20:45:28.669243 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:45:28 crc kubenswrapper[4773]: I1012 20:45:28.669847 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:45:58 crc kubenswrapper[4773]: I1012 20:45:58.670028 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:45:58 crc kubenswrapper[4773]: I1012 20:45:58.670456 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:45:58 crc kubenswrapper[4773]: I1012 
20:45:58.670508 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:45:58 crc kubenswrapper[4773]: I1012 20:45:58.671164 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:45:58 crc kubenswrapper[4773]: I1012 20:45:58.671210 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171" gracePeriod=600 Oct 12 20:45:59 crc kubenswrapper[4773]: I1012 20:45:59.272849 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171" exitCode=0 Oct 12 20:45:59 crc kubenswrapper[4773]: I1012 20:45:59.272915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171"} Oct 12 20:45:59 crc kubenswrapper[4773]: I1012 20:45:59.273294 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97"} Oct 12 20:45:59 crc kubenswrapper[4773]: I1012 20:45:59.273318 4773 scope.go:117] 
"RemoveContainer" containerID="0933e9f4241f82c41af0f2d2f4870feff1ad7b281c06f5be9e23c636fa021737" Oct 12 20:46:31 crc kubenswrapper[4773]: I1012 20:46:31.511756 4773 scope.go:117] "RemoveContainer" containerID="004096eccb5691b26964f28d37cff6f433a5fade0eb5348d8c059f600f1f818a" Oct 12 20:47:31 crc kubenswrapper[4773]: I1012 20:47:31.574607 4773 scope.go:117] "RemoveContainer" containerID="6f9cd7387b9163346d2c04ee47ba4432a434c305d3d2db5c4a9c41f348070865" Oct 12 20:48:05 crc kubenswrapper[4773]: I1012 20:48:05.585133 4773 generic.go:334] "Generic (PLEG): container finished" podID="110bf492-a57a-4b15-9785-ba1947f4d06b" containerID="7038f683eeca1036e9a30c385899c9aa4f981e3ad593e1dab28acbf13f2968e0" exitCode=0 Oct 12 20:48:05 crc kubenswrapper[4773]: I1012 20:48:05.585242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" event={"ID":"110bf492-a57a-4b15-9785-ba1947f4d06b","Type":"ContainerDied","Data":"7038f683eeca1036e9a30c385899c9aa4f981e3ad593e1dab28acbf13f2968e0"} Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.055758 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.103746 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory\") pod \"110bf492-a57a-4b15-9785-ba1947f4d06b\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.104101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle\") pod \"110bf492-a57a-4b15-9785-ba1947f4d06b\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.104261 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key\") pod \"110bf492-a57a-4b15-9785-ba1947f4d06b\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.104414 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfr2p\" (UniqueName: \"kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p\") pod \"110bf492-a57a-4b15-9785-ba1947f4d06b\" (UID: \"110bf492-a57a-4b15-9785-ba1947f4d06b\") " Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.118928 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "110bf492-a57a-4b15-9785-ba1947f4d06b" (UID: "110bf492-a57a-4b15-9785-ba1947f4d06b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.121555 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p" (OuterVolumeSpecName: "kube-api-access-qfr2p") pod "110bf492-a57a-4b15-9785-ba1947f4d06b" (UID: "110bf492-a57a-4b15-9785-ba1947f4d06b"). InnerVolumeSpecName "kube-api-access-qfr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.137473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "110bf492-a57a-4b15-9785-ba1947f4d06b" (UID: "110bf492-a57a-4b15-9785-ba1947f4d06b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.156283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory" (OuterVolumeSpecName: "inventory") pod "110bf492-a57a-4b15-9785-ba1947f4d06b" (UID: "110bf492-a57a-4b15-9785-ba1947f4d06b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.206777 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.206816 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfr2p\" (UniqueName: \"kubernetes.io/projected/110bf492-a57a-4b15-9785-ba1947f4d06b-kube-api-access-qfr2p\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.206830 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.206846 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110bf492-a57a-4b15-9785-ba1947f4d06b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.606870 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" event={"ID":"110bf492-a57a-4b15-9785-ba1947f4d06b","Type":"ContainerDied","Data":"38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9"} Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.606905 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38dd6e6f07df902b6559e404e51cf75fd1dd0ccc0b42116c68476ec9839f94c9" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.606961 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.695178 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l"] Oct 12 20:48:07 crc kubenswrapper[4773]: E1012 20:48:07.695591 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110bf492-a57a-4b15-9785-ba1947f4d06b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.695613 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="110bf492-a57a-4b15-9785-ba1947f4d06b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 20:48:07 crc kubenswrapper[4773]: E1012 20:48:07.695647 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82148a00-62d2-4597-8574-8726b05e9082" containerName="collect-profiles" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.695656 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="82148a00-62d2-4597-8574-8726b05e9082" containerName="collect-profiles" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.695920 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="82148a00-62d2-4597-8574-8726b05e9082" containerName="collect-profiles" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.695947 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="110bf492-a57a-4b15-9785-ba1947f4d06b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.696762 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.700429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.700469 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.700644 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.700732 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.716384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.716522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnpg\" (UniqueName: \"kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.716728 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.726242 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l"] Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.819117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnpg\" (UniqueName: \"kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.819280 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.819403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.823264 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.826122 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:07 crc kubenswrapper[4773]: I1012 20:48:07.844391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnpg\" (UniqueName: \"kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6m46l\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:08 crc kubenswrapper[4773]: I1012 20:48:08.013299 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:48:08 crc kubenswrapper[4773]: I1012 20:48:08.345220 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l"] Oct 12 20:48:08 crc kubenswrapper[4773]: I1012 20:48:08.619186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" event={"ID":"1aa00062-9363-462c-93d6-5552cf5d1c9c","Type":"ContainerStarted","Data":"6b8ef63f7610b5153966806e11d7fcb8655e86ca1b9c19567b645e165bf86561"} Oct 12 20:48:09 crc kubenswrapper[4773]: I1012 20:48:09.629175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" event={"ID":"1aa00062-9363-462c-93d6-5552cf5d1c9c","Type":"ContainerStarted","Data":"1ae31120e8d01dd8881b0c06c40db8e736d61798f1e385c3dd224952e0977dd3"} Oct 12 20:48:09 crc kubenswrapper[4773]: I1012 20:48:09.642527 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" podStartSLOduration=2.043048586 podStartE2EDuration="2.642512006s" podCreationTimestamp="2025-10-12 20:48:07 +0000 UTC" firstStartedPulling="2025-10-12 20:48:08.352270635 +0000 UTC m=+1436.588569185" lastFinishedPulling="2025-10-12 20:48:08.951734045 +0000 UTC m=+1437.188032605" observedRunningTime="2025-10-12 20:48:09.642372363 +0000 UTC m=+1437.878670923" watchObservedRunningTime="2025-10-12 20:48:09.642512006 +0000 UTC m=+1437.878810566" Oct 12 20:48:28 crc kubenswrapper[4773]: I1012 20:48:28.670185 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Oct 12 20:48:28 crc kubenswrapper[4773]: I1012 20:48:28.670667 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.848531 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.852773 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.867947 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.900140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.900207 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:37 crc kubenswrapper[4773]: I1012 20:48:37.900295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd7m\" (UniqueName: 
\"kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.001682 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.001960 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.002086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd7m\" (UniqueName: \"kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.002370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.002393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.025598 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd7m\" (UniqueName: \"kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m\") pod \"redhat-marketplace-gc659\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.179691 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.659764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.908984 4773 generic.go:334] "Generic (PLEG): container finished" podID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerID="73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5" exitCode=0 Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.909058 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerDied","Data":"73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5"} Oct 12 20:48:38 crc kubenswrapper[4773]: I1012 20:48:38.909099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerStarted","Data":"af1b101f09b14e5c86278698cca4ea0b7173e5b190c2865844655aa48323d8be"} Oct 12 20:48:40 crc kubenswrapper[4773]: I1012 20:48:40.938518 4773 generic.go:334] "Generic (PLEG): container 
finished" podID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerID="ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d" exitCode=0 Oct 12 20:48:40 crc kubenswrapper[4773]: I1012 20:48:40.938992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerDied","Data":"ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d"} Oct 12 20:48:41 crc kubenswrapper[4773]: I1012 20:48:41.949155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerStarted","Data":"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec"} Oct 12 20:48:41 crc kubenswrapper[4773]: I1012 20:48:41.974101 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gc659" podStartSLOduration=2.568484527 podStartE2EDuration="4.974082204s" podCreationTimestamp="2025-10-12 20:48:37 +0000 UTC" firstStartedPulling="2025-10-12 20:48:38.915174828 +0000 UTC m=+1467.151473388" lastFinishedPulling="2025-10-12 20:48:41.320772495 +0000 UTC m=+1469.557071065" observedRunningTime="2025-10-12 20:48:41.971692417 +0000 UTC m=+1470.207990977" watchObservedRunningTime="2025-10-12 20:48:41.974082204 +0000 UTC m=+1470.210380774" Oct 12 20:48:48 crc kubenswrapper[4773]: I1012 20:48:48.180820 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:48 crc kubenswrapper[4773]: I1012 20:48:48.181364 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:48 crc kubenswrapper[4773]: I1012 20:48:48.231299 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 
20:48:49 crc kubenswrapper[4773]: I1012 20:48:49.078149 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:49 crc kubenswrapper[4773]: I1012 20:48:49.144574 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.020490 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gc659" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="registry-server" containerID="cri-o://f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec" gracePeriod=2 Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.432929 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.603221 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content\") pod \"ee8e430d-744a-4ce0-8ad4-e17702c25093\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.603359 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities\") pod \"ee8e430d-744a-4ce0-8ad4-e17702c25093\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.603426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gd7m\" (UniqueName: \"kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m\") pod \"ee8e430d-744a-4ce0-8ad4-e17702c25093\" (UID: \"ee8e430d-744a-4ce0-8ad4-e17702c25093\") " Oct 12 
20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.604189 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities" (OuterVolumeSpecName: "utilities") pod "ee8e430d-744a-4ce0-8ad4-e17702c25093" (UID: "ee8e430d-744a-4ce0-8ad4-e17702c25093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.614850 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m" (OuterVolumeSpecName: "kube-api-access-7gd7m") pod "ee8e430d-744a-4ce0-8ad4-e17702c25093" (UID: "ee8e430d-744a-4ce0-8ad4-e17702c25093"). InnerVolumeSpecName "kube-api-access-7gd7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.625769 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee8e430d-744a-4ce0-8ad4-e17702c25093" (UID: "ee8e430d-744a-4ce0-8ad4-e17702c25093"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.706461 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.706760 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8e430d-744a-4ce0-8ad4-e17702c25093-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:51 crc kubenswrapper[4773]: I1012 20:48:51.706771 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gd7m\" (UniqueName: \"kubernetes.io/projected/ee8e430d-744a-4ce0-8ad4-e17702c25093-kube-api-access-7gd7m\") on node \"crc\" DevicePath \"\"" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.029780 4773 generic.go:334] "Generic (PLEG): container finished" podID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerID="f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec" exitCode=0 Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.029823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerDied","Data":"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec"} Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.029838 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gc659" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.029853 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gc659" event={"ID":"ee8e430d-744a-4ce0-8ad4-e17702c25093","Type":"ContainerDied","Data":"af1b101f09b14e5c86278698cca4ea0b7173e5b190c2865844655aa48323d8be"} Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.029874 4773 scope.go:117] "RemoveContainer" containerID="f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.052544 4773 scope.go:117] "RemoveContainer" containerID="ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.072038 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.079781 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gc659"] Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.083833 4773 scope.go:117] "RemoveContainer" containerID="73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.128141 4773 scope.go:117] "RemoveContainer" containerID="f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec" Oct 12 20:48:52 crc kubenswrapper[4773]: E1012 20:48:52.128665 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec\": container with ID starting with f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec not found: ID does not exist" containerID="f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.128706 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec"} err="failed to get container status \"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec\": rpc error: code = NotFound desc = could not find container \"f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec\": container with ID starting with f103c4dc1a13a6ea15443f98dc651e3303b65617e1987ce3000b6ce2ad45e2ec not found: ID does not exist" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.128748 4773 scope.go:117] "RemoveContainer" containerID="ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d" Oct 12 20:48:52 crc kubenswrapper[4773]: E1012 20:48:52.129027 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d\": container with ID starting with ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d not found: ID does not exist" containerID="ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.129069 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d"} err="failed to get container status \"ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d\": rpc error: code = NotFound desc = could not find container \"ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d\": container with ID starting with ac44695c88c1683acc46988b6ec54ae355a5abae68a5b7f4dceb6fbfd0b1750d not found: ID does not exist" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.129096 4773 scope.go:117] "RemoveContainer" containerID="73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5" Oct 12 20:48:52 crc kubenswrapper[4773]: E1012 
20:48:52.129429 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5\": container with ID starting with 73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5 not found: ID does not exist" containerID="73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.129469 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5"} err="failed to get container status \"73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5\": rpc error: code = NotFound desc = could not find container \"73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5\": container with ID starting with 73c3a7cc3cebe36162ac56d8b22986dfdeee145d91e103ed2fe1a87d835148c5 not found: ID does not exist" Oct 12 20:48:52 crc kubenswrapper[4773]: I1012 20:48:52.497276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" path="/var/lib/kubelet/pods/ee8e430d-744a-4ce0-8ad4-e17702c25093/volumes" Oct 12 20:48:58 crc kubenswrapper[4773]: I1012 20:48:58.669625 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:48:58 crc kubenswrapper[4773]: I1012 20:48:58.670252 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 20:49:21 crc kubenswrapper[4773]: I1012 20:49:21.318933 4773 generic.go:334] "Generic (PLEG): container finished" podID="1aa00062-9363-462c-93d6-5552cf5d1c9c" containerID="1ae31120e8d01dd8881b0c06c40db8e736d61798f1e385c3dd224952e0977dd3" exitCode=0 Oct 12 20:49:21 crc kubenswrapper[4773]: I1012 20:49:21.319052 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" event={"ID":"1aa00062-9363-462c-93d6-5552cf5d1c9c","Type":"ContainerDied","Data":"1ae31120e8d01dd8881b0c06c40db8e736d61798f1e385c3dd224952e0977dd3"} Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.752446 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.929265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key\") pod \"1aa00062-9363-462c-93d6-5552cf5d1c9c\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.929865 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory\") pod \"1aa00062-9363-462c-93d6-5552cf5d1c9c\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.930066 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txnpg\" (UniqueName: \"kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg\") pod \"1aa00062-9363-462c-93d6-5552cf5d1c9c\" (UID: \"1aa00062-9363-462c-93d6-5552cf5d1c9c\") " Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.942016 4773 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg" (OuterVolumeSpecName: "kube-api-access-txnpg") pod "1aa00062-9363-462c-93d6-5552cf5d1c9c" (UID: "1aa00062-9363-462c-93d6-5552cf5d1c9c"). InnerVolumeSpecName "kube-api-access-txnpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.960099 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1aa00062-9363-462c-93d6-5552cf5d1c9c" (UID: "1aa00062-9363-462c-93d6-5552cf5d1c9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:49:22 crc kubenswrapper[4773]: I1012 20:49:22.960981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory" (OuterVolumeSpecName: "inventory") pod "1aa00062-9363-462c-93d6-5552cf5d1c9c" (UID: "1aa00062-9363-462c-93d6-5552cf5d1c9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.032710 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.032749 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aa00062-9363-462c-93d6-5552cf5d1c9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.032761 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txnpg\" (UniqueName: \"kubernetes.io/projected/1aa00062-9363-462c-93d6-5552cf5d1c9c-kube-api-access-txnpg\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.345157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" event={"ID":"1aa00062-9363-462c-93d6-5552cf5d1c9c","Type":"ContainerDied","Data":"6b8ef63f7610b5153966806e11d7fcb8655e86ca1b9c19567b645e165bf86561"} Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.345219 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8ef63f7610b5153966806e11d7fcb8655e86ca1b9c19567b645e165bf86561" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.346308 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.416191 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266"] Oct 12 20:49:23 crc kubenswrapper[4773]: E1012 20:49:23.416808 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="registry-server" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.416823 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="registry-server" Oct 12 20:49:23 crc kubenswrapper[4773]: E1012 20:49:23.416847 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="extract-content" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.416853 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="extract-content" Oct 12 20:49:23 crc kubenswrapper[4773]: E1012 20:49:23.416885 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="extract-utilities" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.416892 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="extract-utilities" Oct 12 20:49:23 crc kubenswrapper[4773]: E1012 20:49:23.416910 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa00062-9363-462c-93d6-5552cf5d1c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.416918 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa00062-9363-462c-93d6-5552cf5d1c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:23 crc 
kubenswrapper[4773]: I1012 20:49:23.417088 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8e430d-744a-4ce0-8ad4-e17702c25093" containerName="registry-server" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.417105 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa00062-9363-462c-93d6-5552cf5d1c9c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.417675 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.420620 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.420823 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.420954 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.421069 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.435056 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266"] Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.544488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnv8m\" (UniqueName: \"kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.544560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.544772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.646378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.646524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnv8m\" (UniqueName: \"kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.646565 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.651057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.655362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.665056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnv8m\" (UniqueName: \"kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-9s266\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:23 crc kubenswrapper[4773]: I1012 20:49:23.743846 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:24 crc kubenswrapper[4773]: I1012 20:49:24.339773 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266"] Oct 12 20:49:24 crc kubenswrapper[4773]: I1012 20:49:24.355266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" event={"ID":"b4a89819-9b05-4de3-988d-47f353e73656","Type":"ContainerStarted","Data":"210c9d9666b4448d0782868e93cfc5bc1e25817f23ea31dbe2c3e86eb1c71c31"} Oct 12 20:49:25 crc kubenswrapper[4773]: I1012 20:49:25.384481 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" event={"ID":"b4a89819-9b05-4de3-988d-47f353e73656","Type":"ContainerStarted","Data":"a238ca7d09f0698f071611bc974ca37de1f7baa4ac653ba8d876b48ab9f78b6e"} Oct 12 20:49:25 crc kubenswrapper[4773]: I1012 20:49:25.415210 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" podStartSLOduration=1.9087910940000001 podStartE2EDuration="2.415190354s" podCreationTimestamp="2025-10-12 20:49:23 +0000 UTC" firstStartedPulling="2025-10-12 20:49:24.327183151 +0000 UTC m=+1512.563481711" lastFinishedPulling="2025-10-12 20:49:24.833582411 +0000 UTC m=+1513.069880971" observedRunningTime="2025-10-12 20:49:25.404115446 +0000 UTC m=+1513.640414016" watchObservedRunningTime="2025-10-12 20:49:25.415190354 +0000 UTC m=+1513.651488914" Oct 12 20:49:28 crc kubenswrapper[4773]: I1012 20:49:28.669569 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 12 20:49:28 crc kubenswrapper[4773]: I1012 20:49:28.669924 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:49:28 crc kubenswrapper[4773]: I1012 20:49:28.669972 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:49:28 crc kubenswrapper[4773]: I1012 20:49:28.670651 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:49:28 crc kubenswrapper[4773]: I1012 20:49:28.670735 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" gracePeriod=600 Oct 12 20:49:28 crc kubenswrapper[4773]: E1012 20:49:28.795090 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:49:29 crc kubenswrapper[4773]: I1012 20:49:29.431690 
4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" exitCode=0 Oct 12 20:49:29 crc kubenswrapper[4773]: I1012 20:49:29.431748 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97"} Oct 12 20:49:29 crc kubenswrapper[4773]: I1012 20:49:29.431801 4773 scope.go:117] "RemoveContainer" containerID="4c2267874ca0a1e2b858f357e588e7faae20739dfdbee651abb17c4b8b4bc171" Oct 12 20:49:29 crc kubenswrapper[4773]: I1012 20:49:29.432396 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:49:29 crc kubenswrapper[4773]: E1012 20:49:29.432626 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:49:31 crc kubenswrapper[4773]: I1012 20:49:31.457352 4773 generic.go:334] "Generic (PLEG): container finished" podID="b4a89819-9b05-4de3-988d-47f353e73656" containerID="a238ca7d09f0698f071611bc974ca37de1f7baa4ac653ba8d876b48ab9f78b6e" exitCode=0 Oct 12 20:49:31 crc kubenswrapper[4773]: I1012 20:49:31.457514 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" event={"ID":"b4a89819-9b05-4de3-988d-47f353e73656","Type":"ContainerDied","Data":"a238ca7d09f0698f071611bc974ca37de1f7baa4ac653ba8d876b48ab9f78b6e"} Oct 12 20:49:32 crc 
kubenswrapper[4773]: I1012 20:49:32.925322 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.012345 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnv8m\" (UniqueName: \"kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m\") pod \"b4a89819-9b05-4de3-988d-47f353e73656\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.012430 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key\") pod \"b4a89819-9b05-4de3-988d-47f353e73656\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.012512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory\") pod \"b4a89819-9b05-4de3-988d-47f353e73656\" (UID: \"b4a89819-9b05-4de3-988d-47f353e73656\") " Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.021983 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m" (OuterVolumeSpecName: "kube-api-access-wnv8m") pod "b4a89819-9b05-4de3-988d-47f353e73656" (UID: "b4a89819-9b05-4de3-988d-47f353e73656"). InnerVolumeSpecName "kube-api-access-wnv8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.036534 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory" (OuterVolumeSpecName: "inventory") pod "b4a89819-9b05-4de3-988d-47f353e73656" (UID: "b4a89819-9b05-4de3-988d-47f353e73656"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.040922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b4a89819-9b05-4de3-988d-47f353e73656" (UID: "b4a89819-9b05-4de3-988d-47f353e73656"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.114593 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.114628 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4a89819-9b05-4de3-988d-47f353e73656-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.114642 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnv8m\" (UniqueName: \"kubernetes.io/projected/b4a89819-9b05-4de3-988d-47f353e73656-kube-api-access-wnv8m\") on node \"crc\" DevicePath \"\"" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.478093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" 
event={"ID":"b4a89819-9b05-4de3-988d-47f353e73656","Type":"ContainerDied","Data":"210c9d9666b4448d0782868e93cfc5bc1e25817f23ea31dbe2c3e86eb1c71c31"} Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.478383 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210c9d9666b4448d0782868e93cfc5bc1e25817f23ea31dbe2c3e86eb1c71c31" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.478201 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.555316 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m"] Oct 12 20:49:33 crc kubenswrapper[4773]: E1012 20:49:33.555706 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a89819-9b05-4de3-988d-47f353e73656" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.555738 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a89819-9b05-4de3-988d-47f353e73656" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.555942 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a89819-9b05-4de3-988d-47f353e73656" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.556471 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.558952 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.559241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.565275 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.574316 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m"] Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.575030 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.623915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.624251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.624366 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jr6\" (UniqueName: \"kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.726256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.726443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.726585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jr6\" (UniqueName: \"kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.730464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: 
\"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.732675 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.743927 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jr6\" (UniqueName: \"kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-x242m\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:33 crc kubenswrapper[4773]: I1012 20:49:33.872128 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:49:34 crc kubenswrapper[4773]: I1012 20:49:34.412589 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m"] Oct 12 20:49:34 crc kubenswrapper[4773]: I1012 20:49:34.490579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" event={"ID":"f23068cc-f12e-482e-894c-701621db18e3","Type":"ContainerStarted","Data":"7d604ba05868665133eae25226a985cb53d71c5cfaae9222c3644e7e6611fa34"} Oct 12 20:49:35 crc kubenswrapper[4773]: I1012 20:49:35.499543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" event={"ID":"f23068cc-f12e-482e-894c-701621db18e3","Type":"ContainerStarted","Data":"da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253"} Oct 12 20:49:35 crc kubenswrapper[4773]: I1012 20:49:35.516390 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" podStartSLOduration=2.125912949 podStartE2EDuration="2.516367684s" podCreationTimestamp="2025-10-12 20:49:33 +0000 UTC" firstStartedPulling="2025-10-12 20:49:34.41942971 +0000 UTC m=+1522.655728290" lastFinishedPulling="2025-10-12 20:49:34.809884465 +0000 UTC m=+1523.046183025" observedRunningTime="2025-10-12 20:49:35.511378955 +0000 UTC m=+1523.747677515" watchObservedRunningTime="2025-10-12 20:49:35.516367684 +0000 UTC m=+1523.752666264" Oct 12 20:49:44 crc kubenswrapper[4773]: I1012 20:49:44.481417 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:49:44 crc kubenswrapper[4773]: E1012 20:49:44.482186 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:49:49 crc kubenswrapper[4773]: I1012 20:49:49.068764 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-brkxr"] Oct 12 20:49:49 crc kubenswrapper[4773]: I1012 20:49:49.082068 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-brkxr"] Oct 12 20:49:50 crc kubenswrapper[4773]: I1012 20:49:50.494533 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41844950-f353-4515-940a-61329fbb3d5f" path="/var/lib/kubelet/pods/41844950-f353-4515-940a-61329fbb3d5f/volumes" Oct 12 20:49:53 crc kubenswrapper[4773]: I1012 20:49:53.035913 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4qddn"] Oct 12 20:49:53 crc kubenswrapper[4773]: I1012 20:49:53.045195 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-b96j8"] Oct 12 20:49:53 crc kubenswrapper[4773]: I1012 20:49:53.056928 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4qddn"] Oct 12 20:49:53 crc kubenswrapper[4773]: I1012 20:49:53.065774 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-b96j8"] Oct 12 20:49:54 crc kubenswrapper[4773]: I1012 20:49:54.496703 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1624c85d-701b-4850-a453-33f18a09e91a" path="/var/lib/kubelet/pods/1624c85d-701b-4850-a453-33f18a09e91a/volumes" Oct 12 20:49:54 crc kubenswrapper[4773]: I1012 20:49:54.497760 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e218f6c6-b871-4b56-94a8-64ea740b4b9f" path="/var/lib/kubelet/pods/e218f6c6-b871-4b56-94a8-64ea740b4b9f/volumes" Oct 
12 20:49:55 crc kubenswrapper[4773]: I1012 20:49:55.481127 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:49:55 crc kubenswrapper[4773]: E1012 20:49:55.481518 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:49:59 crc kubenswrapper[4773]: I1012 20:49:59.044784 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-66fa-account-create-mvgsr"] Oct 12 20:49:59 crc kubenswrapper[4773]: I1012 20:49:59.058355 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-66fa-account-create-mvgsr"] Oct 12 20:50:00 crc kubenswrapper[4773]: I1012 20:50:00.496956 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed5c0a6-a628-4c72-acf3-f61de6844a5e" path="/var/lib/kubelet/pods/7ed5c0a6-a628-4c72-acf3-f61de6844a5e/volumes" Oct 12 20:50:03 crc kubenswrapper[4773]: I1012 20:50:03.043389 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f309-account-create-f5mnr"] Oct 12 20:50:03 crc kubenswrapper[4773]: I1012 20:50:03.056201 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b8df-account-create-vsp9w"] Oct 12 20:50:03 crc kubenswrapper[4773]: I1012 20:50:03.063022 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f309-account-create-f5mnr"] Oct 12 20:50:03 crc kubenswrapper[4773]: I1012 20:50:03.069569 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b8df-account-create-vsp9w"] Oct 12 20:50:04 crc kubenswrapper[4773]: I1012 20:50:04.501403 
4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ef866a-4e30-45b3-b35f-18ebf81b265d" path="/var/lib/kubelet/pods/01ef866a-4e30-45b3-b35f-18ebf81b265d/volumes" Oct 12 20:50:04 crc kubenswrapper[4773]: I1012 20:50:04.502759 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c621cba-c18f-4ffd-9685-66cc229b846e" path="/var/lib/kubelet/pods/0c621cba-c18f-4ffd-9685-66cc229b846e/volumes" Oct 12 20:50:07 crc kubenswrapper[4773]: I1012 20:50:07.484040 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:50:07 crc kubenswrapper[4773]: E1012 20:50:07.485077 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:50:19 crc kubenswrapper[4773]: I1012 20:50:19.480639 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:50:19 crc kubenswrapper[4773]: E1012 20:50:19.481498 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:50:19 crc kubenswrapper[4773]: E1012 20:50:19.912378 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23068cc_f12e_482e_894c_701621db18e3.slice/crio-conmon-da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23068cc_f12e_482e_894c_701621db18e3.slice/crio-da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253.scope\": RecentStats: unable to find data in memory cache]" Oct 12 20:50:20 crc kubenswrapper[4773]: I1012 20:50:20.025087 4773 generic.go:334] "Generic (PLEG): container finished" podID="f23068cc-f12e-482e-894c-701621db18e3" containerID="da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253" exitCode=0 Oct 12 20:50:20 crc kubenswrapper[4773]: I1012 20:50:20.025259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" event={"ID":"f23068cc-f12e-482e-894c-701621db18e3","Type":"ContainerDied","Data":"da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253"} Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.041593 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-cmc4s"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.059282 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-f7v8p"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.072816 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-f7v8p"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.085119 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-mktjz"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.093600 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-cmc4s"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.100024 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-mktjz"] Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.405932 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.477112 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key\") pod \"f23068cc-f12e-482e-894c-701621db18e3\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.477158 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory\") pod \"f23068cc-f12e-482e-894c-701621db18e3\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.477229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6jr6\" (UniqueName: \"kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6\") pod \"f23068cc-f12e-482e-894c-701621db18e3\" (UID: \"f23068cc-f12e-482e-894c-701621db18e3\") " Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.494066 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6" (OuterVolumeSpecName: "kube-api-access-z6jr6") pod "f23068cc-f12e-482e-894c-701621db18e3" (UID: "f23068cc-f12e-482e-894c-701621db18e3"). InnerVolumeSpecName "kube-api-access-z6jr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.501823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory" (OuterVolumeSpecName: "inventory") pod "f23068cc-f12e-482e-894c-701621db18e3" (UID: "f23068cc-f12e-482e-894c-701621db18e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.509660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f23068cc-f12e-482e-894c-701621db18e3" (UID: "f23068cc-f12e-482e-894c-701621db18e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.580583 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.580621 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f23068cc-f12e-482e-894c-701621db18e3-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:21 crc kubenswrapper[4773]: I1012 20:50:21.580757 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6jr6\" (UniqueName: \"kubernetes.io/projected/f23068cc-f12e-482e-894c-701621db18e3-kube-api-access-z6jr6\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.043961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" 
event={"ID":"f23068cc-f12e-482e-894c-701621db18e3","Type":"ContainerDied","Data":"7d604ba05868665133eae25226a985cb53d71c5cfaae9222c3644e7e6611fa34"} Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.044338 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d604ba05868665133eae25226a985cb53d71c5cfaae9222c3644e7e6611fa34" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.044034 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.129661 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q"] Oct 12 20:50:22 crc kubenswrapper[4773]: E1012 20:50:22.130108 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23068cc-f12e-482e-894c-701621db18e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.130133 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23068cc-f12e-482e-894c-701621db18e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.130387 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23068cc-f12e-482e-894c-701621db18e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.131370 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.136008 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.136105 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.144085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.145659 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.151127 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q"] Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.191785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzsm\" (UniqueName: \"kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.191847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.192169 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.293793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzsm\" (UniqueName: \"kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.293861 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.293937 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.300193 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: 
\"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.310236 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.320784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzsm\" (UniqueName: \"kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.464283 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.503384 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7606f6-4ebc-4c69-97a2-5311b014e997" path="/var/lib/kubelet/pods/6b7606f6-4ebc-4c69-97a2-5311b014e997/volumes" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.505065 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f29b43-b992-4bd9-8d59-11a564cace05" path="/var/lib/kubelet/pods/93f29b43-b992-4bd9-8d59-11a564cace05/volumes" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.505937 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d" path="/var/lib/kubelet/pods/a006bc6c-fe51-4dc9-b7a5-b028d5a72d8d/volumes" Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.875280 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q"] Oct 12 20:50:22 crc kubenswrapper[4773]: W1012 20:50:22.885759 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3816a145_333d_4188_9caa_656401fad30d.slice/crio-0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28 WatchSource:0}: Error finding container 0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28: Status 404 returned error can't find the container with id 0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28 Oct 12 20:50:22 crc kubenswrapper[4773]: I1012 20:50:22.889159 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:50:23 crc kubenswrapper[4773]: I1012 20:50:23.052127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" 
event={"ID":"3816a145-333d-4188-9caa-656401fad30d","Type":"ContainerStarted","Data":"0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28"} Oct 12 20:50:24 crc kubenswrapper[4773]: I1012 20:50:24.063290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" event={"ID":"3816a145-333d-4188-9caa-656401fad30d","Type":"ContainerStarted","Data":"14be1e6da6f776fbf58d53dffd2e4bca6aa8d6498d89ef6990980e7c8aa43cae"} Oct 12 20:50:24 crc kubenswrapper[4773]: I1012 20:50:24.087163 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" podStartSLOduration=1.6212844560000002 podStartE2EDuration="2.087118188s" podCreationTimestamp="2025-10-12 20:50:22 +0000 UTC" firstStartedPulling="2025-10-12 20:50:22.888962399 +0000 UTC m=+1571.125260959" lastFinishedPulling="2025-10-12 20:50:23.354796131 +0000 UTC m=+1571.591094691" observedRunningTime="2025-10-12 20:50:24.083392264 +0000 UTC m=+1572.319690834" watchObservedRunningTime="2025-10-12 20:50:24.087118188 +0000 UTC m=+1572.323416768" Oct 12 20:50:28 crc kubenswrapper[4773]: I1012 20:50:28.104969 4773 generic.go:334] "Generic (PLEG): container finished" podID="3816a145-333d-4188-9caa-656401fad30d" containerID="14be1e6da6f776fbf58d53dffd2e4bca6aa8d6498d89ef6990980e7c8aa43cae" exitCode=0 Oct 12 20:50:28 crc kubenswrapper[4773]: I1012 20:50:28.106256 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" event={"ID":"3816a145-333d-4188-9caa-656401fad30d","Type":"ContainerDied","Data":"14be1e6da6f776fbf58d53dffd2e4bca6aa8d6498d89ef6990980e7c8aa43cae"} Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.580279 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.668138 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key\") pod \"3816a145-333d-4188-9caa-656401fad30d\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.668289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory\") pod \"3816a145-333d-4188-9caa-656401fad30d\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.668331 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxzsm\" (UniqueName: \"kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm\") pod \"3816a145-333d-4188-9caa-656401fad30d\" (UID: \"3816a145-333d-4188-9caa-656401fad30d\") " Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.674795 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm" (OuterVolumeSpecName: "kube-api-access-rxzsm") pod "3816a145-333d-4188-9caa-656401fad30d" (UID: "3816a145-333d-4188-9caa-656401fad30d"). InnerVolumeSpecName "kube-api-access-rxzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.698252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory" (OuterVolumeSpecName: "inventory") pod "3816a145-333d-4188-9caa-656401fad30d" (UID: "3816a145-333d-4188-9caa-656401fad30d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.708432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3816a145-333d-4188-9caa-656401fad30d" (UID: "3816a145-333d-4188-9caa-656401fad30d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.771877 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.772179 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816a145-333d-4188-9caa-656401fad30d-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:29 crc kubenswrapper[4773]: I1012 20:50:29.772301 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxzsm\" (UniqueName: \"kubernetes.io/projected/3816a145-333d-4188-9caa-656401fad30d-kube-api-access-rxzsm\") on node \"crc\" DevicePath \"\"" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.033317 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-93ce-account-create-2vmdf"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.042209 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dls7h"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.051101 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-93ce-account-create-2vmdf"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.056312 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-056c-account-create-w69zk"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.067229 4773 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dls7h"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.075245 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-056c-account-create-w69zk"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.082449 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-l87dq"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.088705 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-l87dq"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.126144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" event={"ID":"3816a145-333d-4188-9caa-656401fad30d","Type":"ContainerDied","Data":"0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28"} Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.126387 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3d21ab78352fa2d3286989bf8a8d10b686f1d9f349e95b3204ad992ac6df28" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.126203 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.269895 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg"] Oct 12 20:50:30 crc kubenswrapper[4773]: E1012 20:50:30.270930 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3816a145-333d-4188-9caa-656401fad30d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.270952 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3816a145-333d-4188-9caa-656401fad30d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.271420 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3816a145-333d-4188-9caa-656401fad30d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.272488 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.276705 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.277248 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.282134 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.282391 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.303786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg"] Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.384292 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5968n\" (UniqueName: \"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.384403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.384429 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.486202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.486972 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.487185 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5968n\" (UniqueName: \"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.490515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: 
\"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.492437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.497342 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d290bcb-24cb-4bf6-87ac-da388aca2948" path="/var/lib/kubelet/pods/5d290bcb-24cb-4bf6-87ac-da388aca2948/volumes" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.498675 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de57e72-abb2-4344-a3e3-efa878f91a88" path="/var/lib/kubelet/pods/8de57e72-abb2-4344-a3e3-efa878f91a88/volumes" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.500140 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cc8725-7f9c-428d-8591-43183469d84b" path="/var/lib/kubelet/pods/97cc8725-7f9c-428d-8591-43183469d84b/volumes" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.501284 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0c6496-049d-4a32-bf75-0c2279256bb8" path="/var/lib/kubelet/pods/ee0c6496-049d-4a32-bf75-0c2279256bb8/volumes" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.506106 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5968n\" (UniqueName: \"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc 
kubenswrapper[4773]: I1012 20:50:30.593158 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:50:30 crc kubenswrapper[4773]: I1012 20:50:30.946844 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg"] Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.134846 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" event={"ID":"0f75579f-926c-4f64-9c96-187d31dbe98a","Type":"ContainerStarted","Data":"d172e0a589ec46a3dd6d66b4ea2aacf58ebcf9609f163e13a7c848954a42e0f4"} Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.733064 4773 scope.go:117] "RemoveContainer" containerID="825074158d43d7622cb800e3a0f8fb953b8c13e08a88cd89e2a28316ab093879" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.768010 4773 scope.go:117] "RemoveContainer" containerID="f928909672358bcca5c3eb29b474968a73c96b3dba1149d4ea169115ebdc7623" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.786860 4773 scope.go:117] "RemoveContainer" containerID="3a30840981f02fd0b2ecffb99b8b43305ef025d2611d0dda765881b0f1a47e8d" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.838520 4773 scope.go:117] "RemoveContainer" containerID="e7b52457b38890c56a01887aa0151d553d4fa6dd83b807f6423c12478a644561" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.879519 4773 scope.go:117] "RemoveContainer" containerID="55e384668e25bfc563f883f9ba1e2514f006288aa70430fd36ed48d3c4d9e66d" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.897068 4773 scope.go:117] "RemoveContainer" containerID="494e3b7a7f6f75023818be96a21e5d2a29ae1f72850ce82e85ab4c99b9a719db" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.943025 4773 scope.go:117] "RemoveContainer" containerID="2beb93ab4a1f4f0a90f30e313d95da7a4165f8a006e592b23e66ed91e27b7c2e" Oct 12 20:50:31 crc 
kubenswrapper[4773]: I1012 20:50:31.963399 4773 scope.go:117] "RemoveContainer" containerID="df8bfcd302d8db5bb280b9d9a1d3ed49e8a052f4921c14be8b6926bc2e4c0316" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.980685 4773 scope.go:117] "RemoveContainer" containerID="4c4085a1ba02be8f6748a40bbfd8c6bb8a24468f58938786cdbbc9ad8ffe3c30" Oct 12 20:50:31 crc kubenswrapper[4773]: I1012 20:50:31.998418 4773 scope.go:117] "RemoveContainer" containerID="03dd167bc167fc42badc5400046408efe0f45014cb730618b7ae86e4601709df" Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.019792 4773 scope.go:117] "RemoveContainer" containerID="394cbccec20d06ae5450630aadbd87f5192dcbcc3e819130318b67258e22f447" Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.041655 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-296b-account-create-rjf48"] Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.050405 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-296b-account-create-rjf48"] Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.054022 4773 scope.go:117] "RemoveContainer" containerID="bf8aeee3faec45c8318d83568ea26013fa7e31a4b78280cad3a8607c346c575c" Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.072055 4773 scope.go:117] "RemoveContainer" containerID="f6789c82842b479a0f948ae362c648310cd41aa0ca996e30400d4a69eb2c3a18" Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.165666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" event={"ID":"0f75579f-926c-4f64-9c96-187d31dbe98a","Type":"ContainerStarted","Data":"37503f43cde8b16f5e1e3697155caa5969e9cd108896eb63f0aa389b7012ac71"} Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.187678 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" podStartSLOduration=1.587833638 
podStartE2EDuration="2.187662748s" podCreationTimestamp="2025-10-12 20:50:30 +0000 UTC" firstStartedPulling="2025-10-12 20:50:30.957250051 +0000 UTC m=+1579.193548621" lastFinishedPulling="2025-10-12 20:50:31.557079171 +0000 UTC m=+1579.793377731" observedRunningTime="2025-10-12 20:50:32.184045357 +0000 UTC m=+1580.420343917" watchObservedRunningTime="2025-10-12 20:50:32.187662748 +0000 UTC m=+1580.423961308" Oct 12 20:50:32 crc kubenswrapper[4773]: I1012 20:50:32.493704 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47" path="/var/lib/kubelet/pods/c3e09ae6-1c69-4aeb-8c35-1d560c9f7d47/volumes" Oct 12 20:50:34 crc kubenswrapper[4773]: I1012 20:50:34.481846 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:50:34 crc kubenswrapper[4773]: E1012 20:50:34.482509 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:50:45 crc kubenswrapper[4773]: I1012 20:50:45.481165 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:50:45 crc kubenswrapper[4773]: E1012 20:50:45.481992 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:50:58 crc kubenswrapper[4773]: I1012 20:50:58.482201 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:50:58 crc kubenswrapper[4773]: E1012 20:50:58.484884 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:50:59 crc kubenswrapper[4773]: I1012 20:50:59.039917 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wjxlc"] Oct 12 20:50:59 crc kubenswrapper[4773]: I1012 20:50:59.047416 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wjxlc"] Oct 12 20:51:00 crc kubenswrapper[4773]: I1012 20:51:00.496445 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2" path="/var/lib/kubelet/pods/25f6daf1-fb63-4e17-bdcd-6ecc8456c7d2/volumes" Oct 12 20:51:01 crc kubenswrapper[4773]: I1012 20:51:01.051328 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hp22p"] Oct 12 20:51:01 crc kubenswrapper[4773]: I1012 20:51:01.072382 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hp22p"] Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.071305 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-n2ssp"] Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.081641 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lkm6g"] Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.092961 4773 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-n2ssp"] Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.100274 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lkm6g"] Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.496211 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33978648-4803-4e0c-9cba-d75359e55bcd" path="/var/lib/kubelet/pods/33978648-4803-4e0c-9cba-d75359e55bcd/volumes" Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.497925 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601a7ec1-3c5d-4f00-8417-5fb3ee299e8a" path="/var/lib/kubelet/pods/601a7ec1-3c5d-4f00-8417-5fb3ee299e8a/volumes" Oct 12 20:51:02 crc kubenswrapper[4773]: I1012 20:51:02.499431 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fc5b21-582d-4e16-849d-050425c5482a" path="/var/lib/kubelet/pods/f0fc5b21-582d-4e16-849d-050425c5482a/volumes" Oct 12 20:51:13 crc kubenswrapper[4773]: I1012 20:51:13.482316 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:51:13 crc kubenswrapper[4773]: E1012 20:51:13.482931 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:51:19 crc kubenswrapper[4773]: I1012 20:51:19.032868 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qs9dw"] Oct 12 20:51:19 crc kubenswrapper[4773]: I1012 20:51:19.042607 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qs9dw"] Oct 12 20:51:20 
crc kubenswrapper[4773]: I1012 20:51:20.502048 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06c5679-8ddf-4043-b3ae-4fd8986c4483" path="/var/lib/kubelet/pods/d06c5679-8ddf-4043-b3ae-4fd8986c4483/volumes" Oct 12 20:51:25 crc kubenswrapper[4773]: I1012 20:51:25.481643 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:51:25 crc kubenswrapper[4773]: E1012 20:51:25.482986 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.321180 4773 scope.go:117] "RemoveContainer" containerID="64cb79a506d6d49a490b14338ea8bc58c004c5ba6aa5a2bd60b0acc1e3f3721d" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.415355 4773 scope.go:117] "RemoveContainer" containerID="dd19f0ea5fee38b8b8c44436a3586c627c7e813c276efd3de25295819ab15c3c" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.474135 4773 scope.go:117] "RemoveContainer" containerID="80f8f1239f22eb5d2f52a46f3a64790e91aec507be8712d95563ab4bb88b8e81" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.513058 4773 scope.go:117] "RemoveContainer" containerID="ba61c2e034916e80392e9a0da9e27d475fc3a5c422de8875975b995f04107aa2" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.554768 4773 scope.go:117] "RemoveContainer" containerID="185670ff40c99036f948026f9a33d09d50c600480f3418a5fb4a9b3a0b231e9a" Oct 12 20:51:32 crc kubenswrapper[4773]: I1012 20:51:32.583937 4773 scope.go:117] "RemoveContainer" containerID="bad9cc052766a1613855b92aab06323d45ecae86387ebe6845468ae2d3c5d04e" Oct 12 20:51:33 crc 
kubenswrapper[4773]: I1012 20:51:33.767128 4773 generic.go:334] "Generic (PLEG): container finished" podID="0f75579f-926c-4f64-9c96-187d31dbe98a" containerID="37503f43cde8b16f5e1e3697155caa5969e9cd108896eb63f0aa389b7012ac71" exitCode=2 Oct 12 20:51:33 crc kubenswrapper[4773]: I1012 20:51:33.767215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" event={"ID":"0f75579f-926c-4f64-9c96-187d31dbe98a","Type":"ContainerDied","Data":"37503f43cde8b16f5e1e3697155caa5969e9cd108896eb63f0aa389b7012ac71"} Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.231004 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.323525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5968n\" (UniqueName: \"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n\") pod \"0f75579f-926c-4f64-9c96-187d31dbe98a\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.324004 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key\") pod \"0f75579f-926c-4f64-9c96-187d31dbe98a\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.324278 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory\") pod \"0f75579f-926c-4f64-9c96-187d31dbe98a\" (UID: \"0f75579f-926c-4f64-9c96-187d31dbe98a\") " Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.331675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n" (OuterVolumeSpecName: "kube-api-access-5968n") pod "0f75579f-926c-4f64-9c96-187d31dbe98a" (UID: "0f75579f-926c-4f64-9c96-187d31dbe98a"). InnerVolumeSpecName "kube-api-access-5968n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.349029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f75579f-926c-4f64-9c96-187d31dbe98a" (UID: "0f75579f-926c-4f64-9c96-187d31dbe98a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.349100 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory" (OuterVolumeSpecName: "inventory") pod "0f75579f-926c-4f64-9c96-187d31dbe98a" (UID: "0f75579f-926c-4f64-9c96-187d31dbe98a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.426707 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5968n\" (UniqueName: \"kubernetes.io/projected/0f75579f-926c-4f64-9c96-187d31dbe98a-kube-api-access-5968n\") on node \"crc\" DevicePath \"\"" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.426756 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.426765 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f75579f-926c-4f64-9c96-187d31dbe98a-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.789476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" event={"ID":"0f75579f-926c-4f64-9c96-187d31dbe98a","Type":"ContainerDied","Data":"d172e0a589ec46a3dd6d66b4ea2aacf58ebcf9609f163e13a7c848954a42e0f4"} Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.789536 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d172e0a589ec46a3dd6d66b4ea2aacf58ebcf9609f163e13a7c848954a42e0f4" Oct 12 20:51:35 crc kubenswrapper[4773]: I1012 20:51:35.789610 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg" Oct 12 20:51:40 crc kubenswrapper[4773]: I1012 20:51:40.482329 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:51:40 crc kubenswrapper[4773]: E1012 20:51:40.483988 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.037956 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc"] Oct 12 20:51:42 crc kubenswrapper[4773]: E1012 20:51:42.038689 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f75579f-926c-4f64-9c96-187d31dbe98a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.038703 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f75579f-926c-4f64-9c96-187d31dbe98a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.038875 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f75579f-926c-4f64-9c96-187d31dbe98a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.039478 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.042557 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.042939 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.043054 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.042979 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.054679 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc"] Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.075044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97flr\" (UniqueName: \"kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.075110 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.075132 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.176632 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97flr\" (UniqueName: \"kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.176785 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.176824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.197280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: 
\"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.202103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.218689 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97flr\" (UniqueName: \"kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p27qc\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.414049 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:51:42 crc kubenswrapper[4773]: I1012 20:51:42.931429 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc"] Oct 12 20:51:43 crc kubenswrapper[4773]: I1012 20:51:43.883648 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" event={"ID":"17eb0758-b5c9-4f05-8208-b105edc12dc9","Type":"ContainerStarted","Data":"87c278063fb752efb628afa1937897c32526a5cfafa8c887ed8e0fa2d787e91d"} Oct 12 20:51:43 crc kubenswrapper[4773]: I1012 20:51:43.885244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" event={"ID":"17eb0758-b5c9-4f05-8208-b105edc12dc9","Type":"ContainerStarted","Data":"d566174319b4e9624e317e8e0f46a1d0b029efd7b91b6a41834ef8d1c5ea6fcf"} Oct 12 20:51:43 crc kubenswrapper[4773]: I1012 20:51:43.906241 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" podStartSLOduration=1.461430604 podStartE2EDuration="1.906224402s" podCreationTimestamp="2025-10-12 20:51:42 +0000 UTC" firstStartedPulling="2025-10-12 20:51:42.93508067 +0000 UTC m=+1651.171379240" lastFinishedPulling="2025-10-12 20:51:43.379874438 +0000 UTC m=+1651.616173038" observedRunningTime="2025-10-12 20:51:43.902158069 +0000 UTC m=+1652.138456649" watchObservedRunningTime="2025-10-12 20:51:43.906224402 +0000 UTC m=+1652.142522962" Oct 12 20:51:51 crc kubenswrapper[4773]: I1012 20:51:51.042216 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-97sg9"] Oct 12 20:51:51 crc kubenswrapper[4773]: I1012 20:51:51.050519 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-97sg9"] Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 
20:51:52.034184 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hk2db"] Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.042675 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lqr4m"] Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.052610 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hk2db"] Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.059657 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lqr4m"] Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.492981 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1950e421-2f64-4dd1-ba88-acb86d2921dc" path="/var/lib/kubelet/pods/1950e421-2f64-4dd1-ba88-acb86d2921dc/volumes" Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.493752 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c987c62-7e44-493e-b3ad-a23e02081559" path="/var/lib/kubelet/pods/5c987c62-7e44-493e-b3ad-a23e02081559/volumes" Oct 12 20:51:52 crc kubenswrapper[4773]: I1012 20:51:52.494236 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf26e17-d821-48a7-8b36-55b6c8616c22" path="/var/lib/kubelet/pods/6cf26e17-d821-48a7-8b36-55b6c8616c22/volumes" Oct 12 20:51:55 crc kubenswrapper[4773]: I1012 20:51:55.481996 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:51:55 crc kubenswrapper[4773]: E1012 20:51:55.482688 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:51:57 crc kubenswrapper[4773]: I1012 20:51:57.031141 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-479b-account-create-s6gs5"] Oct 12 20:51:57 crc kubenswrapper[4773]: I1012 20:51:57.043677 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-479b-account-create-s6gs5"] Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.031039 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-50ec-account-create-gk4vd"] Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.039685 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-49ba-account-create-5892n"] Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.047059 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-50ec-account-create-gk4vd"] Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.052388 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-49ba-account-create-5892n"] Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.494547 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d550930-e1d2-4785-bf9d-70959c014912" path="/var/lib/kubelet/pods/6d550930-e1d2-4785-bf9d-70959c014912/volumes" Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.495853 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f59080-ff5e-44d1-a7a3-2b8642c8d883" path="/var/lib/kubelet/pods/e1f59080-ff5e-44d1-a7a3-2b8642c8d883/volumes" Oct 12 20:51:58 crc kubenswrapper[4773]: I1012 20:51:58.497220 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8513a2-9bb8-4b6a-892e-d77814d7ad9a" path="/var/lib/kubelet/pods/fa8513a2-9bb8-4b6a-892e-d77814d7ad9a/volumes" Oct 12 20:52:10 crc kubenswrapper[4773]: I1012 20:52:10.485641 4773 scope.go:117] "RemoveContainer" 
containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:52:10 crc kubenswrapper[4773]: E1012 20:52:10.486797 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:52:23 crc kubenswrapper[4773]: I1012 20:52:23.061798 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkjgn"] Oct 12 20:52:23 crc kubenswrapper[4773]: I1012 20:52:23.068667 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kkjgn"] Oct 12 20:52:23 crc kubenswrapper[4773]: I1012 20:52:23.481856 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:52:23 crc kubenswrapper[4773]: E1012 20:52:23.482328 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:52:24 crc kubenswrapper[4773]: I1012 20:52:24.492382 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3655b0f6-2e88-4b9e-b836-18633f2f0535" path="/var/lib/kubelet/pods/3655b0f6-2e88-4b9e-b836-18633f2f0535/volumes" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.762773 4773 scope.go:117] "RemoveContainer" 
containerID="09dcec4d009944e7a5b91bfa992f2039851a8a0bb30860d1759a3ee2a0996d3e" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.783088 4773 scope.go:117] "RemoveContainer" containerID="e0a97b4c2a394f1f9bb07b083f66f1289ba13399d3951eedc7d428b3fa435a3a" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.854961 4773 scope.go:117] "RemoveContainer" containerID="af98ce080a1cdf129b9e5aed28ba6618f3e37052f0aa0db4e23a3fd0daa75ce7" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.878044 4773 scope.go:117] "RemoveContainer" containerID="6c94aa746bfcb6747dca20641f2630d1045e2c3e29d4b6c944d5a97b5887090e" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.937025 4773 scope.go:117] "RemoveContainer" containerID="4db74c28f0b720521375ec3574e54e8c2705fd71f170c06cebb52477d8d70b6b" Oct 12 20:52:32 crc kubenswrapper[4773]: I1012 20:52:32.978340 4773 scope.go:117] "RemoveContainer" containerID="50792f005db49478681242fb97b47cdb8c40d62b068a9cf0b080904cdb0923cf" Oct 12 20:52:33 crc kubenswrapper[4773]: I1012 20:52:33.017455 4773 scope.go:117] "RemoveContainer" containerID="8243b170b608c1ea3f5f21efcfa797fbd22df9ee249a4262b81dfd80fea67fad" Oct 12 20:52:36 crc kubenswrapper[4773]: I1012 20:52:36.423141 4773 generic.go:334] "Generic (PLEG): container finished" podID="17eb0758-b5c9-4f05-8208-b105edc12dc9" containerID="87c278063fb752efb628afa1937897c32526a5cfafa8c887ed8e0fa2d787e91d" exitCode=0 Oct 12 20:52:36 crc kubenswrapper[4773]: I1012 20:52:36.423243 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" event={"ID":"17eb0758-b5c9-4f05-8208-b105edc12dc9","Type":"ContainerDied","Data":"87c278063fb752efb628afa1937897c32526a5cfafa8c887ed8e0fa2d787e91d"} Oct 12 20:52:37 crc kubenswrapper[4773]: I1012 20:52:37.481016 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:52:37 crc kubenswrapper[4773]: E1012 20:52:37.481467 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:52:37 crc kubenswrapper[4773]: I1012 20:52:37.818952 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.012672 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key\") pod \"17eb0758-b5c9-4f05-8208-b105edc12dc9\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.012834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97flr\" (UniqueName: \"kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr\") pod \"17eb0758-b5c9-4f05-8208-b105edc12dc9\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.013008 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory\") pod \"17eb0758-b5c9-4f05-8208-b105edc12dc9\" (UID: \"17eb0758-b5c9-4f05-8208-b105edc12dc9\") " Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.021823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr" (OuterVolumeSpecName: "kube-api-access-97flr") pod 
"17eb0758-b5c9-4f05-8208-b105edc12dc9" (UID: "17eb0758-b5c9-4f05-8208-b105edc12dc9"). InnerVolumeSpecName "kube-api-access-97flr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.038089 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory" (OuterVolumeSpecName: "inventory") pod "17eb0758-b5c9-4f05-8208-b105edc12dc9" (UID: "17eb0758-b5c9-4f05-8208-b105edc12dc9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.043533 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17eb0758-b5c9-4f05-8208-b105edc12dc9" (UID: "17eb0758-b5c9-4f05-8208-b105edc12dc9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.115254 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97flr\" (UniqueName: \"kubernetes.io/projected/17eb0758-b5c9-4f05-8208-b105edc12dc9-kube-api-access-97flr\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.115309 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.115328 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17eb0758-b5c9-4f05-8208-b105edc12dc9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.445563 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" 
event={"ID":"17eb0758-b5c9-4f05-8208-b105edc12dc9","Type":"ContainerDied","Data":"d566174319b4e9624e317e8e0f46a1d0b029efd7b91b6a41834ef8d1c5ea6fcf"} Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.445828 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d566174319b4e9624e317e8e0f46a1d0b029efd7b91b6a41834ef8d1c5ea6fcf" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.445855 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.527631 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jjjj2"] Oct 12 20:52:38 crc kubenswrapper[4773]: E1012 20:52:38.528006 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17eb0758-b5c9-4f05-8208-b105edc12dc9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.528019 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="17eb0758-b5c9-4f05-8208-b105edc12dc9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.528192 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="17eb0758-b5c9-4f05-8208-b105edc12dc9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.528754 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.532211 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.532640 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.532879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.533902 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.549763 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jjjj2"] Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.725783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmgf\" (UniqueName: \"kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.726233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.726542 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.828090 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmgf\" (UniqueName: \"kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.828151 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.828242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.832531 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc 
kubenswrapper[4773]: I1012 20:52:38.833895 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:38 crc kubenswrapper[4773]: I1012 20:52:38.854945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxmgf\" (UniqueName: \"kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf\") pod \"ssh-known-hosts-edpm-deployment-jjjj2\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:39 crc kubenswrapper[4773]: I1012 20:52:39.143629 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:39 crc kubenswrapper[4773]: I1012 20:52:39.768459 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jjjj2"] Oct 12 20:52:40 crc kubenswrapper[4773]: I1012 20:52:40.469196 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" event={"ID":"9129ea8a-c6c0-4ec6-9667-a3513b897a45","Type":"ContainerStarted","Data":"809aa97af8b4d7d7c03a29e3d9cc601536f0a1557c94ee761e74b7955deb4768"} Oct 12 20:52:41 crc kubenswrapper[4773]: I1012 20:52:41.480824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" event={"ID":"9129ea8a-c6c0-4ec6-9667-a3513b897a45","Type":"ContainerStarted","Data":"d8cdee227a7b62dd4b9180ced6e129799a991f6816eed5b5bc3312af5bc54ac7"} Oct 12 20:52:41 crc kubenswrapper[4773]: I1012 20:52:41.498895 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" podStartSLOduration=2.9674832110000002 podStartE2EDuration="3.498869976s" podCreationTimestamp="2025-10-12 20:52:38 +0000 UTC" firstStartedPulling="2025-10-12 20:52:39.772678339 +0000 UTC m=+1708.008976899" lastFinishedPulling="2025-10-12 20:52:40.304065104 +0000 UTC m=+1708.540363664" observedRunningTime="2025-10-12 20:52:41.497206499 +0000 UTC m=+1709.733505059" watchObservedRunningTime="2025-10-12 20:52:41.498869976 +0000 UTC m=+1709.735168546" Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.063002 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wghfz"] Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.072329 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5qcs"] Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.084326 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wghfz"] Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.092092 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5qcs"] Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.493411 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2367dd99-7d9e-412e-a065-a10039761c38" path="/var/lib/kubelet/pods/2367dd99-7d9e-412e-a065-a10039761c38/volumes" Oct 12 20:52:46 crc kubenswrapper[4773]: I1012 20:52:46.494636 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58145541-18b5-45d4-a92e-646ac7ef7961" path="/var/lib/kubelet/pods/58145541-18b5-45d4-a92e-646ac7ef7961/volumes" Oct 12 20:52:48 crc kubenswrapper[4773]: I1012 20:52:48.537005 4773 generic.go:334] "Generic (PLEG): container finished" podID="9129ea8a-c6c0-4ec6-9667-a3513b897a45" containerID="d8cdee227a7b62dd4b9180ced6e129799a991f6816eed5b5bc3312af5bc54ac7" exitCode=0 Oct 12 20:52:48 crc kubenswrapper[4773]: I1012 20:52:48.537066 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" event={"ID":"9129ea8a-c6c0-4ec6-9667-a3513b897a45","Type":"ContainerDied","Data":"d8cdee227a7b62dd4b9180ced6e129799a991f6816eed5b5bc3312af5bc54ac7"} Oct 12 20:52:49 crc kubenswrapper[4773]: I1012 20:52:49.481500 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:52:49 crc kubenswrapper[4773]: E1012 20:52:49.482068 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:52:49 crc kubenswrapper[4773]: I1012 20:52:49.984941 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.143088 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam\") pod \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.143189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0\") pod \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.143432 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmgf\" (UniqueName: \"kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf\") pod \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\" (UID: \"9129ea8a-c6c0-4ec6-9667-a3513b897a45\") " Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.155040 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf" (OuterVolumeSpecName: "kube-api-access-lxmgf") pod "9129ea8a-c6c0-4ec6-9667-a3513b897a45" (UID: "9129ea8a-c6c0-4ec6-9667-a3513b897a45"). InnerVolumeSpecName "kube-api-access-lxmgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.169427 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9129ea8a-c6c0-4ec6-9667-a3513b897a45" (UID: "9129ea8a-c6c0-4ec6-9667-a3513b897a45"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.178479 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9129ea8a-c6c0-4ec6-9667-a3513b897a45" (UID: "9129ea8a-c6c0-4ec6-9667-a3513b897a45"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.250257 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.250307 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9129ea8a-c6c0-4ec6-9667-a3513b897a45-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.250319 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmgf\" (UniqueName: \"kubernetes.io/projected/9129ea8a-c6c0-4ec6-9667-a3513b897a45-kube-api-access-lxmgf\") on node \"crc\" DevicePath \"\"" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.563450 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" event={"ID":"9129ea8a-c6c0-4ec6-9667-a3513b897a45","Type":"ContainerDied","Data":"809aa97af8b4d7d7c03a29e3d9cc601536f0a1557c94ee761e74b7955deb4768"} Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.563991 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809aa97af8b4d7d7c03a29e3d9cc601536f0a1557c94ee761e74b7955deb4768" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.563498 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jjjj2" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.634591 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl"] Oct 12 20:52:50 crc kubenswrapper[4773]: E1012 20:52:50.635090 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9129ea8a-c6c0-4ec6-9667-a3513b897a45" containerName="ssh-known-hosts-edpm-deployment" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.635116 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9129ea8a-c6c0-4ec6-9667-a3513b897a45" containerName="ssh-known-hosts-edpm-deployment" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.635604 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9129ea8a-c6c0-4ec6-9667-a3513b897a45" containerName="ssh-known-hosts-edpm-deployment" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.636389 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.638736 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.638838 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.639464 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.656667 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.662785 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl"] Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.760853 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmc7k\" (UniqueName: \"kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.760926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.760954 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.897760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmc7k\" (UniqueName: \"kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.897881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.897929 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.905683 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.906298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.915901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmc7k\" (UniqueName: \"kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2d4fl\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:50 crc kubenswrapper[4773]: I1012 20:52:50.961781 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:52:51 crc kubenswrapper[4773]: I1012 20:52:51.500861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl"] Oct 12 20:52:51 crc kubenswrapper[4773]: I1012 20:52:51.570861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" event={"ID":"a82d170b-f36c-4d2e-b522-99c93c9d2685","Type":"ContainerStarted","Data":"0957d461842a239e37068e553ccdc280720b30a618e2fc322b88ec22cc6caddd"} Oct 12 20:52:52 crc kubenswrapper[4773]: I1012 20:52:52.581553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" event={"ID":"a82d170b-f36c-4d2e-b522-99c93c9d2685","Type":"ContainerStarted","Data":"0f69f16f07254d1fa3d64642fbb929966189005499c2ab1bb5da4b62b94b42a1"} Oct 12 20:53:01 crc kubenswrapper[4773]: I1012 20:53:01.669142 4773 generic.go:334] "Generic (PLEG): container finished" podID="a82d170b-f36c-4d2e-b522-99c93c9d2685" containerID="0f69f16f07254d1fa3d64642fbb929966189005499c2ab1bb5da4b62b94b42a1" exitCode=0 Oct 12 20:53:01 crc kubenswrapper[4773]: I1012 20:53:01.669360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" event={"ID":"a82d170b-f36c-4d2e-b522-99c93c9d2685","Type":"ContainerDied","Data":"0f69f16f07254d1fa3d64642fbb929966189005499c2ab1bb5da4b62b94b42a1"} Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.145147 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.262387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key\") pod \"a82d170b-f36c-4d2e-b522-99c93c9d2685\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.262495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmc7k\" (UniqueName: \"kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k\") pod \"a82d170b-f36c-4d2e-b522-99c93c9d2685\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.262599 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory\") pod \"a82d170b-f36c-4d2e-b522-99c93c9d2685\" (UID: \"a82d170b-f36c-4d2e-b522-99c93c9d2685\") " Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.274167 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k" (OuterVolumeSpecName: "kube-api-access-nmc7k") pod "a82d170b-f36c-4d2e-b522-99c93c9d2685" (UID: "a82d170b-f36c-4d2e-b522-99c93c9d2685"). InnerVolumeSpecName "kube-api-access-nmc7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.296383 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory" (OuterVolumeSpecName: "inventory") pod "a82d170b-f36c-4d2e-b522-99c93c9d2685" (UID: "a82d170b-f36c-4d2e-b522-99c93c9d2685"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.297314 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a82d170b-f36c-4d2e-b522-99c93c9d2685" (UID: "a82d170b-f36c-4d2e-b522-99c93c9d2685"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.369427 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.369473 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmc7k\" (UniqueName: \"kubernetes.io/projected/a82d170b-f36c-4d2e-b522-99c93c9d2685-kube-api-access-nmc7k\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.369494 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a82d170b-f36c-4d2e-b522-99c93c9d2685-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.689909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" event={"ID":"a82d170b-f36c-4d2e-b522-99c93c9d2685","Type":"ContainerDied","Data":"0957d461842a239e37068e553ccdc280720b30a618e2fc322b88ec22cc6caddd"} Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.689947 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0957d461842a239e37068e553ccdc280720b30a618e2fc322b88ec22cc6caddd" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.690026 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.788991 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w"] Oct 12 20:53:03 crc kubenswrapper[4773]: E1012 20:53:03.789856 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82d170b-f36c-4d2e-b522-99c93c9d2685" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.789949 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82d170b-f36c-4d2e-b522-99c93c9d2685" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.790243 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82d170b-f36c-4d2e-b522-99c93c9d2685" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.791143 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.793585 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.794025 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.793689 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.793790 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.801804 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w"] Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.877521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.877590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.877622 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2m86\" (UniqueName: \"kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.979106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.980117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.980277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2m86\" (UniqueName: \"kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.991878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: 
\"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.994262 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:03 crc kubenswrapper[4773]: I1012 20:53:03.995836 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2m86\" (UniqueName: \"kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:04 crc kubenswrapper[4773]: I1012 20:53:04.115650 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:04 crc kubenswrapper[4773]: I1012 20:53:04.481404 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:53:04 crc kubenswrapper[4773]: E1012 20:53:04.481999 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:53:04 crc kubenswrapper[4773]: I1012 20:53:04.635442 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w"] Oct 12 20:53:04 crc kubenswrapper[4773]: I1012 20:53:04.700209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" event={"ID":"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5","Type":"ContainerStarted","Data":"f97e5d1d8b1dab1ec0f217c30fbc42be9efb0cc761a37f5266b99b2b15c9a109"} Oct 12 20:53:06 crc kubenswrapper[4773]: I1012 20:53:06.721978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" event={"ID":"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5","Type":"ContainerStarted","Data":"0065fb1508276a0a3e128ec33a9383012dd6f2e4e90616d24d804c66caccccc2"} Oct 12 20:53:15 crc kubenswrapper[4773]: I1012 20:53:15.799477 4773 generic.go:334] "Generic (PLEG): container finished" podID="076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" containerID="0065fb1508276a0a3e128ec33a9383012dd6f2e4e90616d24d804c66caccccc2" exitCode=0 Oct 12 20:53:15 crc kubenswrapper[4773]: I1012 20:53:15.799960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" event={"ID":"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5","Type":"ContainerDied","Data":"0065fb1508276a0a3e128ec33a9383012dd6f2e4e90616d24d804c66caccccc2"} Oct 12 20:53:16 crc kubenswrapper[4773]: I1012 20:53:16.482435 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:53:16 crc kubenswrapper[4773]: E1012 20:53:16.483256 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.262524 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.463542 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key\") pod \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.464240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory\") pod \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.464290 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2m86\" (UniqueName: \"kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86\") pod \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\" (UID: \"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5\") " Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.472673 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86" (OuterVolumeSpecName: "kube-api-access-h2m86") pod "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" (UID: "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5"). InnerVolumeSpecName "kube-api-access-h2m86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.495042 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory" (OuterVolumeSpecName: "inventory") pod "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" (UID: "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.499994 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" (UID: "076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.566984 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.567024 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2m86\" (UniqueName: \"kubernetes.io/projected/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-kube-api-access-h2m86\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.567037 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.822412 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" event={"ID":"076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5","Type":"ContainerDied","Data":"f97e5d1d8b1dab1ec0f217c30fbc42be9efb0cc761a37f5266b99b2b15c9a109"} Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.822463 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f97e5d1d8b1dab1ec0f217c30fbc42be9efb0cc761a37f5266b99b2b15c9a109" Oct 12 20:53:17 crc kubenswrapper[4773]: I1012 20:53:17.822474 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w" Oct 12 20:53:31 crc kubenswrapper[4773]: I1012 20:53:31.042619 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vhgr7"] Oct 12 20:53:31 crc kubenswrapper[4773]: I1012 20:53:31.052562 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vhgr7"] Oct 12 20:53:31 crc kubenswrapper[4773]: I1012 20:53:31.481280 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:53:31 crc kubenswrapper[4773]: E1012 20:53:31.481571 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:53:32 crc kubenswrapper[4773]: I1012 20:53:32.499336 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439aaeb2-8cab-4025-a89b-33fda13b4c5d" path="/var/lib/kubelet/pods/439aaeb2-8cab-4025-a89b-33fda13b4c5d/volumes" Oct 12 20:53:33 crc kubenswrapper[4773]: I1012 20:53:33.131660 4773 scope.go:117] "RemoveContainer" containerID="a03922c4fbafa82c4fc6237c18b8c45736b72024be3ad3312fa40f232f9c2319" Oct 12 20:53:33 crc kubenswrapper[4773]: I1012 20:53:33.182490 4773 scope.go:117] "RemoveContainer" containerID="63728d2f1ddff77313bc16ed7b29c60dd1cf7a896d5e0ae60b66500db4ef5f25" Oct 12 20:53:33 crc kubenswrapper[4773]: I1012 20:53:33.250076 4773 scope.go:117] "RemoveContainer" containerID="79e2214c3d6be752386f2526678055c93984630918e896f4fc108121fe57e511" Oct 12 20:53:44 crc kubenswrapper[4773]: I1012 20:53:44.482493 4773 scope.go:117] "RemoveContainer" 
containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:53:44 crc kubenswrapper[4773]: E1012 20:53:44.483878 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:53:55 crc kubenswrapper[4773]: I1012 20:53:55.481859 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:53:55 crc kubenswrapper[4773]: E1012 20:53:55.482621 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:54:07 crc kubenswrapper[4773]: I1012 20:54:07.481458 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:54:07 crc kubenswrapper[4773]: E1012 20:54:07.482376 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.738057 4773 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:08 crc kubenswrapper[4773]: E1012 20:54:08.738470 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.738484 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.738667 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.740061 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.756694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.858200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgrrs\" (UniqueName: \"kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.858260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc 
kubenswrapper[4773]: I1012 20:54:08.858363 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.960424 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.960864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.961053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrrs\" (UniqueName: \"kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.960940 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 
20:54:08.961233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:08 crc kubenswrapper[4773]: I1012 20:54:08.984191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgrrs\" (UniqueName: \"kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs\") pod \"redhat-operators-w9d4h\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:09 crc kubenswrapper[4773]: I1012 20:54:09.065471 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:09 crc kubenswrapper[4773]: I1012 20:54:09.541169 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:10 crc kubenswrapper[4773]: I1012 20:54:10.283708 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerID="45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa" exitCode=0 Oct 12 20:54:10 crc kubenswrapper[4773]: I1012 20:54:10.283864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerDied","Data":"45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa"} Oct 12 20:54:10 crc kubenswrapper[4773]: I1012 20:54:10.283895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerStarted","Data":"b50fe93314a1c00d9f4b0aae61e6faa5d93c4a2ba6507fb8c54197972adc14ba"} Oct 12 20:54:11 crc 
kubenswrapper[4773]: I1012 20:54:11.292916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerStarted","Data":"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023"} Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.338455 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.341819 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.360361 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.424090 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd67x\" (UniqueName: \"kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.424237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.424321 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content\") pod \"certified-operators-vc7sv\" (UID: 
\"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.526992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd67x\" (UniqueName: \"kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.527130 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.527292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.527882 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.528065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") 
" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.553510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd67x\" (UniqueName: \"kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x\") pod \"certified-operators-vc7sv\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.662868 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.940641 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.944408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:12 crc kubenswrapper[4773]: I1012 20:54:12.959586 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.042620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwjv\" (UniqueName: \"kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.042670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " 
pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.042955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.144376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.144676 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.144779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwjv\" (UniqueName: \"kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.145125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " 
pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.145253 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.167161 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwjv\" (UniqueName: \"kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv\") pod \"community-operators-r8fgr\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.205115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:13 crc kubenswrapper[4773]: W1012 20:54:13.207544 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e850845_b9ce_4ea9_9081_3019b92a874c.slice/crio-e1f98911438cdfc17946587ed7e7c375569d3787528fb3ab08764948eeac35af WatchSource:0}: Error finding container e1f98911438cdfc17946587ed7e7c375569d3787528fb3ab08764948eeac35af: Status 404 returned error can't find the container with id e1f98911438cdfc17946587ed7e7c375569d3787528fb3ab08764948eeac35af Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.283740 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.312563 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerStarted","Data":"e1f98911438cdfc17946587ed7e7c375569d3787528fb3ab08764948eeac35af"} Oct 12 20:54:13 crc kubenswrapper[4773]: I1012 20:54:13.787785 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:13 crc kubenswrapper[4773]: W1012 20:54:13.801008 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797bac59_c8b3_40c5_95a5_1f4983c4dcb6.slice/crio-16e709af35f2c7d27542fa2cb4c5c7448c2bfb67dc056a44d088cbd0a55cdbb3 WatchSource:0}: Error finding container 16e709af35f2c7d27542fa2cb4c5c7448c2bfb67dc056a44d088cbd0a55cdbb3: Status 404 returned error can't find the container with id 16e709af35f2c7d27542fa2cb4c5c7448c2bfb67dc056a44d088cbd0a55cdbb3 Oct 12 20:54:14 crc kubenswrapper[4773]: I1012 20:54:14.323217 4773 generic.go:334] "Generic (PLEG): container finished" podID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerID="22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205" exitCode=0 Oct 12 20:54:14 crc kubenswrapper[4773]: I1012 20:54:14.323318 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerDied","Data":"22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205"} Oct 12 20:54:14 crc kubenswrapper[4773]: I1012 20:54:14.323344 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" 
event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerStarted","Data":"16e709af35f2c7d27542fa2cb4c5c7448c2bfb67dc056a44d088cbd0a55cdbb3"} Oct 12 20:54:14 crc kubenswrapper[4773]: I1012 20:54:14.328151 4773 generic.go:334] "Generic (PLEG): container finished" podID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerID="2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb" exitCode=0 Oct 12 20:54:14 crc kubenswrapper[4773]: I1012 20:54:14.328192 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerDied","Data":"2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb"} Oct 12 20:54:15 crc kubenswrapper[4773]: I1012 20:54:15.337304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerStarted","Data":"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63"} Oct 12 20:54:15 crc kubenswrapper[4773]: I1012 20:54:15.339324 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerID="805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023" exitCode=0 Oct 12 20:54:15 crc kubenswrapper[4773]: I1012 20:54:15.339350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerDied","Data":"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023"} Oct 12 20:54:16 crc kubenswrapper[4773]: I1012 20:54:16.359467 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerStarted","Data":"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef"} Oct 12 20:54:17 crc kubenswrapper[4773]: I1012 
20:54:17.368620 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerStarted","Data":"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180"} Oct 12 20:54:17 crc kubenswrapper[4773]: I1012 20:54:17.372023 4773 generic.go:334] "Generic (PLEG): container finished" podID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerID="046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63" exitCode=0 Oct 12 20:54:17 crc kubenswrapper[4773]: I1012 20:54:17.372437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerDied","Data":"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63"} Oct 12 20:54:17 crc kubenswrapper[4773]: I1012 20:54:17.393997 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w9d4h" podStartSLOduration=3.295226782 podStartE2EDuration="9.393917788s" podCreationTimestamp="2025-10-12 20:54:08 +0000 UTC" firstStartedPulling="2025-10-12 20:54:10.28612845 +0000 UTC m=+1798.522427010" lastFinishedPulling="2025-10-12 20:54:16.384819456 +0000 UTC m=+1804.621118016" observedRunningTime="2025-10-12 20:54:17.38970961 +0000 UTC m=+1805.626008170" watchObservedRunningTime="2025-10-12 20:54:17.393917788 +0000 UTC m=+1805.630216348" Oct 12 20:54:18 crc kubenswrapper[4773]: I1012 20:54:18.379591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerStarted","Data":"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea"} Oct 12 20:54:18 crc kubenswrapper[4773]: I1012 20:54:18.381083 4773 generic.go:334] "Generic (PLEG): container finished" podID="4e850845-b9ce-4ea9-9081-3019b92a874c" 
containerID="46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef" exitCode=0 Oct 12 20:54:18 crc kubenswrapper[4773]: I1012 20:54:18.381111 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerDied","Data":"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef"} Oct 12 20:54:18 crc kubenswrapper[4773]: I1012 20:54:18.427551 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8fgr" podStartSLOduration=2.862226931 podStartE2EDuration="6.427533543s" podCreationTimestamp="2025-10-12 20:54:12 +0000 UTC" firstStartedPulling="2025-10-12 20:54:14.324713246 +0000 UTC m=+1802.561011826" lastFinishedPulling="2025-10-12 20:54:17.890019878 +0000 UTC m=+1806.126318438" observedRunningTime="2025-10-12 20:54:18.404250093 +0000 UTC m=+1806.640548653" watchObservedRunningTime="2025-10-12 20:54:18.427533543 +0000 UTC m=+1806.663832103" Oct 12 20:54:19 crc kubenswrapper[4773]: I1012 20:54:19.065978 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:19 crc kubenswrapper[4773]: I1012 20:54:19.066040 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:19 crc kubenswrapper[4773]: I1012 20:54:19.394926 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerStarted","Data":"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0"} Oct 12 20:54:19 crc kubenswrapper[4773]: I1012 20:54:19.430620 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vc7sv" podStartSLOduration=2.95144955 podStartE2EDuration="7.430602795s" 
podCreationTimestamp="2025-10-12 20:54:12 +0000 UTC" firstStartedPulling="2025-10-12 20:54:14.329756976 +0000 UTC m=+1802.566055536" lastFinishedPulling="2025-10-12 20:54:18.808910231 +0000 UTC m=+1807.045208781" observedRunningTime="2025-10-12 20:54:19.421882922 +0000 UTC m=+1807.658181482" watchObservedRunningTime="2025-10-12 20:54:19.430602795 +0000 UTC m=+1807.666901355" Oct 12 20:54:20 crc kubenswrapper[4773]: I1012 20:54:20.115120 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w9d4h" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" probeResult="failure" output=< Oct 12 20:54:20 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 20:54:20 crc kubenswrapper[4773]: > Oct 12 20:54:20 crc kubenswrapper[4773]: I1012 20:54:20.481588 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:54:20 crc kubenswrapper[4773]: E1012 20:54:20.481824 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 20:54:22 crc kubenswrapper[4773]: I1012 20:54:22.663989 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:22 crc kubenswrapper[4773]: I1012 20:54:22.664338 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.284786 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.284835 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.350941 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.473133 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.710023 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vc7sv" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="registry-server" probeResult="failure" output=< Oct 12 20:54:23 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 20:54:23 crc kubenswrapper[4773]: > Oct 12 20:54:23 crc kubenswrapper[4773]: I1012 20:54:23.921952 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.440274 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8fgr" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="registry-server" containerID="cri-o://02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea" gracePeriod=2 Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.893447 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.925605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnwjv\" (UniqueName: \"kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv\") pod \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.925748 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities\") pod \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.925809 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content\") pod \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\" (UID: \"797bac59-c8b3-40c5-95a5-1f4983c4dcb6\") " Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.927181 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities" (OuterVolumeSpecName: "utilities") pod "797bac59-c8b3-40c5-95a5-1f4983c4dcb6" (UID: "797bac59-c8b3-40c5-95a5-1f4983c4dcb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.951955 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv" (OuterVolumeSpecName: "kube-api-access-qnwjv") pod "797bac59-c8b3-40c5-95a5-1f4983c4dcb6" (UID: "797bac59-c8b3-40c5-95a5-1f4983c4dcb6"). InnerVolumeSpecName "kube-api-access-qnwjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:54:25 crc kubenswrapper[4773]: I1012 20:54:25.996440 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "797bac59-c8b3-40c5-95a5-1f4983c4dcb6" (UID: "797bac59-c8b3-40c5-95a5-1f4983c4dcb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.028069 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnwjv\" (UniqueName: \"kubernetes.io/projected/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-kube-api-access-qnwjv\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.028098 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.028107 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/797bac59-c8b3-40c5-95a5-1f4983c4dcb6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.450384 4773 generic.go:334] "Generic (PLEG): container finished" podID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerID="02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea" exitCode=0 Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.450456 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8fgr" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.450451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerDied","Data":"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea"} Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.450849 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8fgr" event={"ID":"797bac59-c8b3-40c5-95a5-1f4983c4dcb6","Type":"ContainerDied","Data":"16e709af35f2c7d27542fa2cb4c5c7448c2bfb67dc056a44d088cbd0a55cdbb3"} Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.450887 4773 scope.go:117] "RemoveContainer" containerID="02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.478636 4773 scope.go:117] "RemoveContainer" containerID="046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.495890 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.495936 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8fgr"] Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.505776 4773 scope.go:117] "RemoveContainer" containerID="22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.555932 4773 scope.go:117] "RemoveContainer" containerID="02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea" Oct 12 20:54:26 crc kubenswrapper[4773]: E1012 20:54:26.556262 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea\": container with ID starting with 02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea not found: ID does not exist" containerID="02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.556323 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea"} err="failed to get container status \"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea\": rpc error: code = NotFound desc = could not find container \"02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea\": container with ID starting with 02dd4dbdf34c3e941dd9bfa4b08d250c83647d186db13bd78ec47b69b74dd1ea not found: ID does not exist" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.556349 4773 scope.go:117] "RemoveContainer" containerID="046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63" Oct 12 20:54:26 crc kubenswrapper[4773]: E1012 20:54:26.556567 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63\": container with ID starting with 046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63 not found: ID does not exist" containerID="046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.556592 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63"} err="failed to get container status \"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63\": rpc error: code = NotFound desc = could not find container \"046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63\": container with ID 
starting with 046186d85577324bebaabf0e1fd70637c3f271b14ac0a7a988bdfab683aecb63 not found: ID does not exist" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.556606 4773 scope.go:117] "RemoveContainer" containerID="22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205" Oct 12 20:54:26 crc kubenswrapper[4773]: E1012 20:54:26.557074 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205\": container with ID starting with 22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205 not found: ID does not exist" containerID="22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205" Oct 12 20:54:26 crc kubenswrapper[4773]: I1012 20:54:26.557101 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205"} err="failed to get container status \"22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205\": rpc error: code = NotFound desc = could not find container \"22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205\": container with ID starting with 22a94176e04da50e3e43178b3e84f7d11e6708b30c14f000c443b3184a571205 not found: ID does not exist" Oct 12 20:54:28 crc kubenswrapper[4773]: I1012 20:54:28.492755 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" path="/var/lib/kubelet/pods/797bac59-c8b3-40c5-95a5-1f4983c4dcb6/volumes" Oct 12 20:54:30 crc kubenswrapper[4773]: I1012 20:54:30.110345 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w9d4h" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" probeResult="failure" output=< Oct 12 20:54:30 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 20:54:30 crc 
kubenswrapper[4773]: > Oct 12 20:54:32 crc kubenswrapper[4773]: I1012 20:54:32.492460 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:54:32 crc kubenswrapper[4773]: I1012 20:54:32.727170 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:32 crc kubenswrapper[4773]: I1012 20:54:32.782674 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:33 crc kubenswrapper[4773]: I1012 20:54:33.527555 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d"} Oct 12 20:54:36 crc kubenswrapper[4773]: I1012 20:54:36.665707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:36 crc kubenswrapper[4773]: I1012 20:54:36.667347 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vc7sv" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="registry-server" containerID="cri-o://5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0" gracePeriod=2 Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.122014 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.230752 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd67x\" (UniqueName: \"kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x\") pod \"4e850845-b9ce-4ea9-9081-3019b92a874c\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.230950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities\") pod \"4e850845-b9ce-4ea9-9081-3019b92a874c\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.231746 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities" (OuterVolumeSpecName: "utilities") pod "4e850845-b9ce-4ea9-9081-3019b92a874c" (UID: "4e850845-b9ce-4ea9-9081-3019b92a874c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.231845 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content\") pod \"4e850845-b9ce-4ea9-9081-3019b92a874c\" (UID: \"4e850845-b9ce-4ea9-9081-3019b92a874c\") " Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.235559 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.238902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x" (OuterVolumeSpecName: "kube-api-access-sd67x") pod "4e850845-b9ce-4ea9-9081-3019b92a874c" (UID: "4e850845-b9ce-4ea9-9081-3019b92a874c"). InnerVolumeSpecName "kube-api-access-sd67x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.276544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e850845-b9ce-4ea9-9081-3019b92a874c" (UID: "4e850845-b9ce-4ea9-9081-3019b92a874c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.337454 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd67x\" (UniqueName: \"kubernetes.io/projected/4e850845-b9ce-4ea9-9081-3019b92a874c-kube-api-access-sd67x\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.337495 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e850845-b9ce-4ea9-9081-3019b92a874c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.565235 4773 generic.go:334] "Generic (PLEG): container finished" podID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerID="5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0" exitCode=0 Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.565304 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc7sv" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.565337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerDied","Data":"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0"} Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.566037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc7sv" event={"ID":"4e850845-b9ce-4ea9-9081-3019b92a874c","Type":"ContainerDied","Data":"e1f98911438cdfc17946587ed7e7c375569d3787528fb3ab08764948eeac35af"} Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.566064 4773 scope.go:117] "RemoveContainer" containerID="5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.587315 4773 scope.go:117] "RemoveContainer" 
containerID="46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.616016 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.624241 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vc7sv"] Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.626068 4773 scope.go:117] "RemoveContainer" containerID="2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.646027 4773 scope.go:117] "RemoveContainer" containerID="5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0" Oct 12 20:54:37 crc kubenswrapper[4773]: E1012 20:54:37.646377 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0\": container with ID starting with 5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0 not found: ID does not exist" containerID="5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.646427 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0"} err="failed to get container status \"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0\": rpc error: code = NotFound desc = could not find container \"5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0\": container with ID starting with 5ca95f5edb7e69fefcea8d67f3c067bc53d2be681363e9598b00841c45bacde0 not found: ID does not exist" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.646449 4773 scope.go:117] "RemoveContainer" 
containerID="46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef" Oct 12 20:54:37 crc kubenswrapper[4773]: E1012 20:54:37.647015 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef\": container with ID starting with 46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef not found: ID does not exist" containerID="46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.647133 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef"} err="failed to get container status \"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef\": rpc error: code = NotFound desc = could not find container \"46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef\": container with ID starting with 46c50df472eb64f59f88a09806955d7fe81bf123868a30c7d29c847132bad5ef not found: ID does not exist" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.647220 4773 scope.go:117] "RemoveContainer" containerID="2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb" Oct 12 20:54:37 crc kubenswrapper[4773]: E1012 20:54:37.647682 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb\": container with ID starting with 2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb not found: ID does not exist" containerID="2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb" Oct 12 20:54:37 crc kubenswrapper[4773]: I1012 20:54:37.647704 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb"} err="failed to get container status \"2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb\": rpc error: code = NotFound desc = could not find container \"2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb\": container with ID starting with 2ce9ca0d45b597a4458771a62d32840f430c0aea8a834580c72d08edf6139fcb not found: ID does not exist" Oct 12 20:54:38 crc kubenswrapper[4773]: I1012 20:54:38.497267 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" path="/var/lib/kubelet/pods/4e850845-b9ce-4ea9-9081-3019b92a874c/volumes" Oct 12 20:54:40 crc kubenswrapper[4773]: I1012 20:54:40.113552 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w9d4h" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" probeResult="failure" output=< Oct 12 20:54:40 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 20:54:40 crc kubenswrapper[4773]: > Oct 12 20:54:49 crc kubenswrapper[4773]: I1012 20:54:49.106998 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:49 crc kubenswrapper[4773]: I1012 20:54:49.153811 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:49 crc kubenswrapper[4773]: I1012 20:54:49.349756 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:50 crc kubenswrapper[4773]: I1012 20:54:50.678428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w9d4h" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" 
containerID="cri-o://29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180" gracePeriod=2 Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.135417 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.214586 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities\") pod \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.214653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgrrs\" (UniqueName: \"kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs\") pod \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.214804 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content\") pod \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\" (UID: \"8e93a34f-e7eb-4209-8b53-8f40697a55c8\") " Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.216275 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities" (OuterVolumeSpecName: "utilities") pod "8e93a34f-e7eb-4209-8b53-8f40697a55c8" (UID: "8e93a34f-e7eb-4209-8b53-8f40697a55c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.224394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs" (OuterVolumeSpecName: "kube-api-access-xgrrs") pod "8e93a34f-e7eb-4209-8b53-8f40697a55c8" (UID: "8e93a34f-e7eb-4209-8b53-8f40697a55c8"). InnerVolumeSpecName "kube-api-access-xgrrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.309230 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e93a34f-e7eb-4209-8b53-8f40697a55c8" (UID: "8e93a34f-e7eb-4209-8b53-8f40697a55c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.316602 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.316689 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e93a34f-e7eb-4209-8b53-8f40697a55c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.316894 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgrrs\" (UniqueName: \"kubernetes.io/projected/8e93a34f-e7eb-4209-8b53-8f40697a55c8-kube-api-access-xgrrs\") on node \"crc\" DevicePath \"\"" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.691836 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" 
containerID="29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180" exitCode=0 Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.691889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerDied","Data":"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180"} Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.691919 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9d4h" event={"ID":"8e93a34f-e7eb-4209-8b53-8f40697a55c8","Type":"ContainerDied","Data":"b50fe93314a1c00d9f4b0aae61e6faa5d93c4a2ba6507fb8c54197972adc14ba"} Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.691923 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w9d4h" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.691940 4773 scope.go:117] "RemoveContainer" containerID="29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.720511 4773 scope.go:117] "RemoveContainer" containerID="805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.745870 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.758610 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w9d4h"] Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.762567 4773 scope.go:117] "RemoveContainer" containerID="45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.799107 4773 scope.go:117] "RemoveContainer" containerID="29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180" Oct 12 20:54:51 crc 
kubenswrapper[4773]: E1012 20:54:51.799867 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180\": container with ID starting with 29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180 not found: ID does not exist" containerID="29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.799925 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180"} err="failed to get container status \"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180\": rpc error: code = NotFound desc = could not find container \"29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180\": container with ID starting with 29f48e3971b05726399564f6d79257d344f7ccbce662ae41805c5230c6f87180 not found: ID does not exist" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.799960 4773 scope.go:117] "RemoveContainer" containerID="805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023" Oct 12 20:54:51 crc kubenswrapper[4773]: E1012 20:54:51.802705 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023\": container with ID starting with 805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023 not found: ID does not exist" containerID="805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.802758 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023"} err="failed to get container status 
\"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023\": rpc error: code = NotFound desc = could not find container \"805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023\": container with ID starting with 805d3d026738f2cfd1c928d303f4a692b166380db10b118c0cfab079c6218023 not found: ID does not exist" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.802777 4773 scope.go:117] "RemoveContainer" containerID="45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa" Oct 12 20:54:51 crc kubenswrapper[4773]: E1012 20:54:51.803160 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa\": container with ID starting with 45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa not found: ID does not exist" containerID="45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa" Oct 12 20:54:51 crc kubenswrapper[4773]: I1012 20:54:51.803193 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa"} err="failed to get container status \"45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa\": rpc error: code = NotFound desc = could not find container \"45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa\": container with ID starting with 45f7615c49dc1d23ea6b754a70ebbb523d8071ca47cabe0727b622427eebc6fa not found: ID does not exist" Oct 12 20:54:52 crc kubenswrapper[4773]: I1012 20:54:52.501535 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" path="/var/lib/kubelet/pods/8e93a34f-e7eb-4209-8b53-8f40697a55c8/volumes" Oct 12 20:56:58 crc kubenswrapper[4773]: I1012 20:56:58.669648 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:56:58 crc kubenswrapper[4773]: I1012 20:56:58.672501 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:57:28 crc kubenswrapper[4773]: I1012 20:57:28.669925 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:57:28 crc kubenswrapper[4773]: I1012 20:57:28.670749 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:57:58 crc kubenswrapper[4773]: I1012 20:57:58.669799 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 20:57:58 crc kubenswrapper[4773]: I1012 20:57:58.670592 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 20:57:58 crc kubenswrapper[4773]: I1012 20:57:58.670660 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 20:57:58 crc kubenswrapper[4773]: I1012 20:57:58.671854 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 20:57:58 crc kubenswrapper[4773]: I1012 20:57:58.671959 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d" gracePeriod=600 Oct 12 20:57:59 crc kubenswrapper[4773]: I1012 20:57:59.539102 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d" exitCode=0 Oct 12 20:57:59 crc kubenswrapper[4773]: I1012 20:57:59.539311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d"} Oct 12 20:57:59 crc kubenswrapper[4773]: I1012 20:57:59.539899 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8"} Oct 12 20:57:59 crc kubenswrapper[4773]: I1012 20:57:59.539936 4773 scope.go:117] "RemoveContainer" containerID="44647eaf6fabad84f00033c8fa264523feac419f330054ba6016201ba8abfe97" Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.685475 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.697697 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.706815 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.715100 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.721351 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.727861 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.735215 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.741228 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6m46l"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.746924 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9vttw"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.754117 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wxl6q"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.762867 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jjjj2"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.770758 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-twfqx"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.778367 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.786087 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.791793 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-x242m"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.796835 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rw7sg"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.801990 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2d4fl"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.807152 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jjjj2"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.812191 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.816906 4773 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rrr4w"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.821770 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p27qc"] Oct 12 20:58:54 crc kubenswrapper[4773]: I1012 20:58:54.826446 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-9s266"] Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.498336 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5" path="/var/lib/kubelet/pods/076ce5eb-e8e9-45ff-a1ef-8b0e60fd13f5/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.499459 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f75579f-926c-4f64-9c96-187d31dbe98a" path="/var/lib/kubelet/pods/0f75579f-926c-4f64-9c96-187d31dbe98a/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.500551 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110bf492-a57a-4b15-9785-ba1947f4d06b" path="/var/lib/kubelet/pods/110bf492-a57a-4b15-9785-ba1947f4d06b/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.501666 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17eb0758-b5c9-4f05-8208-b105edc12dc9" path="/var/lib/kubelet/pods/17eb0758-b5c9-4f05-8208-b105edc12dc9/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.503844 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa00062-9363-462c-93d6-5552cf5d1c9c" path="/var/lib/kubelet/pods/1aa00062-9363-462c-93d6-5552cf5d1c9c/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.505091 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3816a145-333d-4188-9caa-656401fad30d" 
path="/var/lib/kubelet/pods/3816a145-333d-4188-9caa-656401fad30d/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.506310 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6247dcc5-2188-4112-b4b8-b53023878263" path="/var/lib/kubelet/pods/6247dcc5-2188-4112-b4b8-b53023878263/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.507060 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9129ea8a-c6c0-4ec6-9667-a3513b897a45" path="/var/lib/kubelet/pods/9129ea8a-c6c0-4ec6-9667-a3513b897a45/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.507680 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a82d170b-f36c-4d2e-b522-99c93c9d2685" path="/var/lib/kubelet/pods/a82d170b-f36c-4d2e-b522-99c93c9d2685/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.508196 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a89819-9b05-4de3-988d-47f353e73656" path="/var/lib/kubelet/pods/b4a89819-9b05-4de3-988d-47f353e73656/volumes" Oct 12 20:58:56 crc kubenswrapper[4773]: I1012 20:58:56.509195 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23068cc-f12e-482e-894c-701621db18e3" path="/var/lib/kubelet/pods/f23068cc-f12e-482e-894c-701621db18e3/volumes" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.149637 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151304 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151326 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151336 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151344 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151374 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151382 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151394 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151404 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151426 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151442 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151450 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151463 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151470 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="extract-content" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151484 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151493 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: E1012 20:59:04.151510 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151518 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="extract-utilities" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151745 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e93a34f-e7eb-4209-8b53-8f40697a55c8" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151787 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="797bac59-c8b3-40c5-95a5-1f4983c4dcb6" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.151801 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e850845-b9ce-4ea9-9081-3019b92a874c" containerName="registry-server" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.156707 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.179331 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.274275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.274407 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.274469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vhn2\" (UniqueName: \"kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.377354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.377499 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.377564 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vhn2\" (UniqueName: \"kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.377984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.378211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.438336 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vhn2\" (UniqueName: \"kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2\") pod \"redhat-marketplace-mfwk2\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.481675 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:04 crc kubenswrapper[4773]: I1012 20:59:04.972658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:05 crc kubenswrapper[4773]: I1012 20:59:05.168033 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerStarted","Data":"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c"} Oct 12 20:59:05 crc kubenswrapper[4773]: I1012 20:59:05.169040 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerStarted","Data":"1dcd3d762ce10b0e2e22f919d8d5cb3c283bc8a7ec9a0af15885f5d59ec92977"} Oct 12 20:59:06 crc kubenswrapper[4773]: I1012 20:59:06.187246 4773 generic.go:334] "Generic (PLEG): container finished" podID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerID="f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c" exitCode=0 Oct 12 20:59:06 crc kubenswrapper[4773]: I1012 20:59:06.187317 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerDied","Data":"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c"} Oct 12 20:59:06 crc kubenswrapper[4773]: I1012 20:59:06.190065 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.209671 4773 generic.go:334] "Generic (PLEG): container finished" podID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerID="07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2" exitCode=0 Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.209789 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerDied","Data":"07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2"} Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.771822 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p"] Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.773135 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.777295 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.778747 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.779081 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.779397 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.782789 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.785174 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p"] Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.866155 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.866574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.866624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.866753 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.866867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.968767 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.968931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.969083 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.969225 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.969280 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.974060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.977124 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.980096 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc kubenswrapper[4773]: I1012 20:59:08.980660 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:08 crc 
kubenswrapper[4773]: I1012 20:59:08.984724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:09 crc kubenswrapper[4773]: I1012 20:59:09.115049 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:09 crc kubenswrapper[4773]: I1012 20:59:09.226943 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerStarted","Data":"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21"} Oct 12 20:59:09 crc kubenswrapper[4773]: I1012 20:59:09.273467 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mfwk2" podStartSLOduration=2.850922729 podStartE2EDuration="5.273433153s" podCreationTimestamp="2025-10-12 20:59:04 +0000 UTC" firstStartedPulling="2025-10-12 20:59:06.189754093 +0000 UTC m=+2094.426052673" lastFinishedPulling="2025-10-12 20:59:08.612264517 +0000 UTC m=+2096.848563097" observedRunningTime="2025-10-12 20:59:09.262306033 +0000 UTC m=+2097.498604593" watchObservedRunningTime="2025-10-12 20:59:09.273433153 +0000 UTC m=+2097.509731713" Oct 12 20:59:09 crc kubenswrapper[4773]: I1012 20:59:09.664202 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p"] Oct 12 20:59:09 crc kubenswrapper[4773]: W1012 20:59:09.669971 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3045d7b_b25d_4036_bea1_0b5f184476eb.slice/crio-3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4 WatchSource:0}: Error finding container 3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4: Status 404 returned error can't find the container with id 3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4 Oct 12 20:59:10 crc kubenswrapper[4773]: I1012 20:59:10.240811 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" event={"ID":"d3045d7b-b25d-4036-bea1-0b5f184476eb","Type":"ContainerStarted","Data":"3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4"} Oct 12 20:59:11 crc kubenswrapper[4773]: I1012 20:59:11.254069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" event={"ID":"d3045d7b-b25d-4036-bea1-0b5f184476eb","Type":"ContainerStarted","Data":"cc1ae94509694c7631671f7dbeb616b833e2ae17278af1865dcdc673a51fd663"} Oct 12 20:59:11 crc kubenswrapper[4773]: I1012 20:59:11.274703 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" podStartSLOduration=2.788952126 podStartE2EDuration="3.274686909s" podCreationTimestamp="2025-10-12 20:59:08 +0000 UTC" firstStartedPulling="2025-10-12 20:59:09.672139103 +0000 UTC m=+2097.908437663" lastFinishedPulling="2025-10-12 20:59:10.157873886 +0000 UTC m=+2098.394172446" observedRunningTime="2025-10-12 20:59:11.27114181 +0000 UTC m=+2099.507440370" watchObservedRunningTime="2025-10-12 20:59:11.274686909 +0000 UTC m=+2099.510985469" Oct 12 20:59:14 crc kubenswrapper[4773]: I1012 20:59:14.538264 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:14 crc kubenswrapper[4773]: I1012 20:59:14.538927 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:14 crc kubenswrapper[4773]: I1012 20:59:14.575831 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:15 crc kubenswrapper[4773]: I1012 20:59:15.346560 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:15 crc kubenswrapper[4773]: I1012 20:59:15.419767 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.299012 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mfwk2" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="registry-server" containerID="cri-o://a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21" gracePeriod=2 Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.740381 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.829726 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities\") pod \"f78df811-212d-47d4-ac5a-9d10ce4cc453\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.829898 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content\") pod \"f78df811-212d-47d4-ac5a-9d10ce4cc453\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.830067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vhn2\" (UniqueName: \"kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2\") pod \"f78df811-212d-47d4-ac5a-9d10ce4cc453\" (UID: \"f78df811-212d-47d4-ac5a-9d10ce4cc453\") " Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.836104 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2" (OuterVolumeSpecName: "kube-api-access-6vhn2") pod "f78df811-212d-47d4-ac5a-9d10ce4cc453" (UID: "f78df811-212d-47d4-ac5a-9d10ce4cc453"). InnerVolumeSpecName "kube-api-access-6vhn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.837863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities" (OuterVolumeSpecName: "utilities") pod "f78df811-212d-47d4-ac5a-9d10ce4cc453" (UID: "f78df811-212d-47d4-ac5a-9d10ce4cc453"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.843679 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f78df811-212d-47d4-ac5a-9d10ce4cc453" (UID: "f78df811-212d-47d4-ac5a-9d10ce4cc453"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.932363 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vhn2\" (UniqueName: \"kubernetes.io/projected/f78df811-212d-47d4-ac5a-9d10ce4cc453-kube-api-access-6vhn2\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.932405 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:17 crc kubenswrapper[4773]: I1012 20:59:17.932420 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f78df811-212d-47d4-ac5a-9d10ce4cc453-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.308978 4773 generic.go:334] "Generic (PLEG): container finished" podID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerID="a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21" exitCode=0 Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.309019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerDied","Data":"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21"} Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.309052 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mfwk2" event={"ID":"f78df811-212d-47d4-ac5a-9d10ce4cc453","Type":"ContainerDied","Data":"1dcd3d762ce10b0e2e22f919d8d5cb3c283bc8a7ec9a0af15885f5d59ec92977"} Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.309069 4773 scope.go:117] "RemoveContainer" containerID="a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.309085 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfwk2" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.340971 4773 scope.go:117] "RemoveContainer" containerID="07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.355658 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.361605 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfwk2"] Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.363178 4773 scope.go:117] "RemoveContainer" containerID="f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.415991 4773 scope.go:117] "RemoveContainer" containerID="a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21" Oct 12 20:59:18 crc kubenswrapper[4773]: E1012 20:59:18.416789 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21\": container with ID starting with a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21 not found: ID does not exist" containerID="a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.416823 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21"} err="failed to get container status \"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21\": rpc error: code = NotFound desc = could not find container \"a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21\": container with ID starting with a5e539600606a97a4b0697033907bc6937203fb80472b7d97b4a59b331280a21 not found: ID does not exist" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.416862 4773 scope.go:117] "RemoveContainer" containerID="07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2" Oct 12 20:59:18 crc kubenswrapper[4773]: E1012 20:59:18.417236 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2\": container with ID starting with 07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2 not found: ID does not exist" containerID="07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.417259 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2"} err="failed to get container status \"07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2\": rpc error: code = NotFound desc = could not find container \"07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2\": container with ID starting with 07d5d39aeb4e71956f17b47e5977580a47eb1b9e888aab53dd0ff51bbb5a79f2 not found: ID does not exist" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.417289 4773 scope.go:117] "RemoveContainer" containerID="f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c" Oct 12 20:59:18 crc kubenswrapper[4773]: E1012 
20:59:18.417837 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c\": container with ID starting with f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c not found: ID does not exist" containerID="f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.417855 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c"} err="failed to get container status \"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c\": rpc error: code = NotFound desc = could not find container \"f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c\": container with ID starting with f50c041a582fa93b238b742fece86ec1779d283c74dee0c0d8dbcfafa3c54c1c not found: ID does not exist" Oct 12 20:59:18 crc kubenswrapper[4773]: I1012 20:59:18.491367 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" path="/var/lib/kubelet/pods/f78df811-212d-47d4-ac5a-9d10ce4cc453/volumes" Oct 12 20:59:22 crc kubenswrapper[4773]: I1012 20:59:22.344599 4773 generic.go:334] "Generic (PLEG): container finished" podID="d3045d7b-b25d-4036-bea1-0b5f184476eb" containerID="cc1ae94509694c7631671f7dbeb616b833e2ae17278af1865dcdc673a51fd663" exitCode=0 Oct 12 20:59:22 crc kubenswrapper[4773]: I1012 20:59:22.344695 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" event={"ID":"d3045d7b-b25d-4036-bea1-0b5f184476eb","Type":"ContainerDied","Data":"cc1ae94509694c7631671f7dbeb616b833e2ae17278af1865dcdc673a51fd663"} Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.752626 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.842686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory\") pod \"d3045d7b-b25d-4036-bea1-0b5f184476eb\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.842900 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7\") pod \"d3045d7b-b25d-4036-bea1-0b5f184476eb\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.842941 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph\") pod \"d3045d7b-b25d-4036-bea1-0b5f184476eb\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.842988 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle\") pod \"d3045d7b-b25d-4036-bea1-0b5f184476eb\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.843104 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key\") pod \"d3045d7b-b25d-4036-bea1-0b5f184476eb\" (UID: \"d3045d7b-b25d-4036-bea1-0b5f184476eb\") " Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.847764 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph" (OuterVolumeSpecName: "ceph") pod "d3045d7b-b25d-4036-bea1-0b5f184476eb" (UID: "d3045d7b-b25d-4036-bea1-0b5f184476eb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.851915 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d3045d7b-b25d-4036-bea1-0b5f184476eb" (UID: "d3045d7b-b25d-4036-bea1-0b5f184476eb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.858569 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7" (OuterVolumeSpecName: "kube-api-access-zs9t7") pod "d3045d7b-b25d-4036-bea1-0b5f184476eb" (UID: "d3045d7b-b25d-4036-bea1-0b5f184476eb"). InnerVolumeSpecName "kube-api-access-zs9t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.877027 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3045d7b-b25d-4036-bea1-0b5f184476eb" (UID: "d3045d7b-b25d-4036-bea1-0b5f184476eb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.878238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory" (OuterVolumeSpecName: "inventory") pod "d3045d7b-b25d-4036-bea1-0b5f184476eb" (UID: "d3045d7b-b25d-4036-bea1-0b5f184476eb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.945534 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.945560 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/d3045d7b-b25d-4036-bea1-0b5f184476eb-kube-api-access-zs9t7\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.945572 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.945581 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:23 crc kubenswrapper[4773]: I1012 20:59:23.945589 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3045d7b-b25d-4036-bea1-0b5f184476eb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.368761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" event={"ID":"d3045d7b-b25d-4036-bea1-0b5f184476eb","Type":"ContainerDied","Data":"3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4"} Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.369196 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b56648855697179bdac282bc7c552d0e5c944e0111a9cb905f0df7aad8f2dd4" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.368849 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.471768 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2"] Oct 12 20:59:24 crc kubenswrapper[4773]: E1012 20:59:24.472154 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="extract-utilities" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472174 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="extract-utilities" Oct 12 20:59:24 crc kubenswrapper[4773]: E1012 20:59:24.472189 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3045d7b-b25d-4036-bea1-0b5f184476eb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472198 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3045d7b-b25d-4036-bea1-0b5f184476eb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:59:24 crc kubenswrapper[4773]: E1012 20:59:24.472240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="registry-server" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472249 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="registry-server" Oct 12 20:59:24 crc kubenswrapper[4773]: E1012 20:59:24.472258 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="extract-content" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472267 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="extract-content" Oct 12 
20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472459 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3045d7b-b25d-4036-bea1-0b5f184476eb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.472477 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78df811-212d-47d4-ac5a-9d10ce4cc453" containerName="registry-server" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.473230 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.475227 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.475485 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.477695 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.477828 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.478746 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.524266 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2"] Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.558382 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.558434 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l45s\" (UniqueName: \"kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.558714 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.558881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.558923 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.660835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.660888 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.660913 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.661041 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.661061 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l45s\" (UniqueName: 
\"kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.665383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.672241 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.673102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.674267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 
20:59:24.684000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l45s\" (UniqueName: \"kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:24 crc kubenswrapper[4773]: I1012 20:59:24.822840 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" Oct 12 20:59:25 crc kubenswrapper[4773]: I1012 20:59:25.363206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2"] Oct 12 20:59:25 crc kubenswrapper[4773]: I1012 20:59:25.388115 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" event={"ID":"25b3f977-6673-4aa8-aadc-89d98ceb7638","Type":"ContainerStarted","Data":"2b0b84b31d52091e2ebfea6eeddb577e2ead6ec8aaa0850e2f1c443e28521097"} Oct 12 20:59:26 crc kubenswrapper[4773]: I1012 20:59:26.398217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" event={"ID":"25b3f977-6673-4aa8-aadc-89d98ceb7638","Type":"ContainerStarted","Data":"13130b97f94fe9337363410dbe470459a55b2612cbdac569ef597f976f057562"} Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.503483 4773 scope.go:117] "RemoveContainer" containerID="da7cfea6b651bdb22f9461bff7616e70f28be5c42e4bc498c5d85595d9ea7253" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.590795 4773 scope.go:117] "RemoveContainer" containerID="d8cdee227a7b62dd4b9180ced6e129799a991f6816eed5b5bc3312af5bc54ac7" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.654102 4773 scope.go:117] "RemoveContainer" 
containerID="37503f43cde8b16f5e1e3697155caa5969e9cd108896eb63f0aa389b7012ac71" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.699170 4773 scope.go:117] "RemoveContainer" containerID="14be1e6da6f776fbf58d53dffd2e4bca6aa8d6498d89ef6990980e7c8aa43cae" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.729355 4773 scope.go:117] "RemoveContainer" containerID="6fd9d00fd866419e79b85ba425c00b7a84997153f12f6c0c25b1b8d4de8c977c" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.769542 4773 scope.go:117] "RemoveContainer" containerID="0065fb1508276a0a3e128ec33a9383012dd6f2e4e90616d24d804c66caccccc2" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.815138 4773 scope.go:117] "RemoveContainer" containerID="0f69f16f07254d1fa3d64642fbb929966189005499c2ab1bb5da4b62b94b42a1" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.856668 4773 scope.go:117] "RemoveContainer" containerID="87c278063fb752efb628afa1937897c32526a5cfafa8c887ed8e0fa2d787e91d" Oct 12 20:59:33 crc kubenswrapper[4773]: I1012 20:59:33.907243 4773 scope.go:117] "RemoveContainer" containerID="7038f683eeca1036e9a30c385899c9aa4f981e3ad593e1dab28acbf13f2968e0" Oct 12 20:59:34 crc kubenswrapper[4773]: I1012 20:59:34.002796 4773 scope.go:117] "RemoveContainer" containerID="a238ca7d09f0698f071611bc974ca37de1f7baa4ac653ba8d876b48ab9f78b6e" Oct 12 20:59:34 crc kubenswrapper[4773]: I1012 20:59:34.037155 4773 scope.go:117] "RemoveContainer" containerID="1ae31120e8d01dd8881b0c06c40db8e736d61798f1e385c3dd224952e0977dd3" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.143297 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" podStartSLOduration=35.713384668 podStartE2EDuration="36.143269206s" podCreationTimestamp="2025-10-12 20:59:24 +0000 UTC" firstStartedPulling="2025-10-12 20:59:25.374990294 +0000 UTC m=+2113.611288854" lastFinishedPulling="2025-10-12 20:59:25.804874832 +0000 UTC m=+2114.041173392" 
observedRunningTime="2025-10-12 20:59:26.421148839 +0000 UTC m=+2114.657447419" watchObservedRunningTime="2025-10-12 21:00:00.143269206 +0000 UTC m=+2148.379567786" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.151282 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9"] Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.153295 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.158135 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.158944 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.162841 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9"] Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.238228 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.238315 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzxq8\" (UniqueName: \"kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.238341 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.340354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.340406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzxq8\" (UniqueName: \"kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.340430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.342535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.347496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.355204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzxq8\" (UniqueName: \"kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8\") pod \"collect-profiles-29338380-zrtg9\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.482432 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:00 crc kubenswrapper[4773]: I1012 21:00:00.894128 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9"] Oct 12 21:00:01 crc kubenswrapper[4773]: I1012 21:00:01.709433 4773 generic.go:334] "Generic (PLEG): container finished" podID="20fdc591-be6d-43c2-8ea1-d69358551827" containerID="bc0d1d6616d9f4070dec2f76f0318fdfd692c664772e2fc717ac9008d55fa187" exitCode=0 Oct 12 21:00:01 crc kubenswrapper[4773]: I1012 21:00:01.709549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" event={"ID":"20fdc591-be6d-43c2-8ea1-d69358551827","Type":"ContainerDied","Data":"bc0d1d6616d9f4070dec2f76f0318fdfd692c664772e2fc717ac9008d55fa187"} Oct 12 21:00:01 crc kubenswrapper[4773]: I1012 21:00:01.709729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" event={"ID":"20fdc591-be6d-43c2-8ea1-d69358551827","Type":"ContainerStarted","Data":"0cb16f66aad1a2a9bae2e88f50d2d2152ecd540d687bb29119d7ca9a80bd6d9e"} Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.090075 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.113543 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume\") pod \"20fdc591-be6d-43c2-8ea1-d69358551827\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.113644 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume\") pod \"20fdc591-be6d-43c2-8ea1-d69358551827\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.113858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzxq8\" (UniqueName: \"kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8\") pod \"20fdc591-be6d-43c2-8ea1-d69358551827\" (UID: \"20fdc591-be6d-43c2-8ea1-d69358551827\") " Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.114389 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume" (OuterVolumeSpecName: "config-volume") pod "20fdc591-be6d-43c2-8ea1-d69358551827" (UID: "20fdc591-be6d-43c2-8ea1-d69358551827"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.119902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8" (OuterVolumeSpecName: "kube-api-access-vzxq8") pod "20fdc591-be6d-43c2-8ea1-d69358551827" (UID: "20fdc591-be6d-43c2-8ea1-d69358551827"). 
InnerVolumeSpecName "kube-api-access-vzxq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.119931 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20fdc591-be6d-43c2-8ea1-d69358551827" (UID: "20fdc591-be6d-43c2-8ea1-d69358551827"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.216572 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fdc591-be6d-43c2-8ea1-d69358551827-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.216615 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fdc591-be6d-43c2-8ea1-d69358551827-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.216630 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzxq8\" (UniqueName: \"kubernetes.io/projected/20fdc591-be6d-43c2-8ea1-d69358551827-kube-api-access-vzxq8\") on node \"crc\" DevicePath \"\"" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.728467 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" event={"ID":"20fdc591-be6d-43c2-8ea1-d69358551827","Type":"ContainerDied","Data":"0cb16f66aad1a2a9bae2e88f50d2d2152ecd540d687bb29119d7ca9a80bd6d9e"} Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.728563 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb16f66aad1a2a9bae2e88f50d2d2152ecd540d687bb29119d7ca9a80bd6d9e" Oct 12 21:00:03 crc kubenswrapper[4773]: I1012 21:00:03.728523 4773 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9" Oct 12 21:00:04 crc kubenswrapper[4773]: I1012 21:00:04.159368 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8"] Oct 12 21:00:04 crc kubenswrapper[4773]: I1012 21:00:04.165008 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338335-9wxc8"] Oct 12 21:00:04 crc kubenswrapper[4773]: I1012 21:00:04.491353 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5565c983-8814-411e-b913-0ea8e4d73c0f" path="/var/lib/kubelet/pods/5565c983-8814-411e-b913-0ea8e4d73c0f/volumes" Oct 12 21:00:28 crc kubenswrapper[4773]: I1012 21:00:28.669571 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:00:28 crc kubenswrapper[4773]: I1012 21:00:28.670318 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:00:34 crc kubenswrapper[4773]: I1012 21:00:34.263924 4773 scope.go:117] "RemoveContainer" containerID="f2734d002fe5e49905f4e0c20eb2c2fd5cbd5a533f6142200d8a6342f90f8c72" Oct 12 21:00:58 crc kubenswrapper[4773]: I1012 21:00:58.670141 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:00:58 crc kubenswrapper[4773]: I1012 21:00:58.670940 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.170338 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29338381-pxljd"] Oct 12 21:01:00 crc kubenswrapper[4773]: E1012 21:01:00.171509 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20fdc591-be6d-43c2-8ea1-d69358551827" containerName="collect-profiles" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.171626 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="20fdc591-be6d-43c2-8ea1-d69358551827" containerName="collect-profiles" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.171965 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="20fdc591-be6d-43c2-8ea1-d69358551827" containerName="collect-profiles" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.172795 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29338381-pxljd" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.201308 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29338381-pxljd"] Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.222240 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.222510 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.222662 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtl9\" (UniqueName: \"kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.222784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd" Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.325156 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.325207 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.325284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtl9\" (UniqueName: \"kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.325301 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.332063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.336523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.345459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.346953 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtl9\" (UniqueName: \"kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9\") pod \"keystone-cron-29338381-pxljd\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") " pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.488021 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:00 crc kubenswrapper[4773]: I1012 21:01:00.957201 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29338381-pxljd"]
Oct 12 21:01:01 crc kubenswrapper[4773]: I1012 21:01:01.295111 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29338381-pxljd" event={"ID":"d2812224-3ef8-431f-896d-01d9d78c3650","Type":"ContainerStarted","Data":"ca50937a3bf71d85836444e54bdfc9415a6284240e7d8aeeb230ac37f851bb99"}
Oct 12 21:01:01 crc kubenswrapper[4773]: I1012 21:01:01.295174 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29338381-pxljd" event={"ID":"d2812224-3ef8-431f-896d-01d9d78c3650","Type":"ContainerStarted","Data":"4e3287b51be9bd086c31a7267c697748a1b8b2a1b256f32cf69a2c42df45fa7f"}
Oct 12 21:01:01 crc kubenswrapper[4773]: I1012 21:01:01.314803 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29338381-pxljd" podStartSLOduration=1.314782567 podStartE2EDuration="1.314782567s" podCreationTimestamp="2025-10-12 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:01:01.311628159 +0000 UTC m=+2209.547926759" watchObservedRunningTime="2025-10-12 21:01:01.314782567 +0000 UTC m=+2209.551081127"
Oct 12 21:01:04 crc kubenswrapper[4773]: I1012 21:01:04.323635 4773 generic.go:334] "Generic (PLEG): container finished" podID="d2812224-3ef8-431f-896d-01d9d78c3650" containerID="ca50937a3bf71d85836444e54bdfc9415a6284240e7d8aeeb230ac37f851bb99" exitCode=0
Oct 12 21:01:04 crc kubenswrapper[4773]: I1012 21:01:04.323815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29338381-pxljd" event={"ID":"d2812224-3ef8-431f-896d-01d9d78c3650","Type":"ContainerDied","Data":"ca50937a3bf71d85836444e54bdfc9415a6284240e7d8aeeb230ac37f851bb99"}
Oct 12 21:01:05 crc kubenswrapper[4773]: E1012 21:01:05.547363 4773 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.719575 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.831857 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys\") pod \"d2812224-3ef8-431f-896d-01d9d78c3650\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") "
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.831938 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle\") pod \"d2812224-3ef8-431f-896d-01d9d78c3650\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") "
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.831976 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data\") pod \"d2812224-3ef8-431f-896d-01d9d78c3650\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") "
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.832195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtl9\" (UniqueName: \"kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9\") pod \"d2812224-3ef8-431f-896d-01d9d78c3650\" (UID: \"d2812224-3ef8-431f-896d-01d9d78c3650\") "
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.837952 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d2812224-3ef8-431f-896d-01d9d78c3650" (UID: "d2812224-3ef8-431f-896d-01d9d78c3650"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.838008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9" (OuterVolumeSpecName: "kube-api-access-mbtl9") pod "d2812224-3ef8-431f-896d-01d9d78c3650" (UID: "d2812224-3ef8-431f-896d-01d9d78c3650"). InnerVolumeSpecName "kube-api-access-mbtl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.861614 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2812224-3ef8-431f-896d-01d9d78c3650" (UID: "d2812224-3ef8-431f-896d-01d9d78c3650"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.880289 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data" (OuterVolumeSpecName: "config-data") pod "d2812224-3ef8-431f-896d-01d9d78c3650" (UID: "d2812224-3ef8-431f-896d-01d9d78c3650"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.934391 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbtl9\" (UniqueName: \"kubernetes.io/projected/d2812224-3ef8-431f-896d-01d9d78c3650-kube-api-access-mbtl9\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.934577 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.934644 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:05 crc kubenswrapper[4773]: I1012 21:01:05.934705 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2812224-3ef8-431f-896d-01d9d78c3650-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:06 crc kubenswrapper[4773]: I1012 21:01:06.345066 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29338381-pxljd" event={"ID":"d2812224-3ef8-431f-896d-01d9d78c3650","Type":"ContainerDied","Data":"4e3287b51be9bd086c31a7267c697748a1b8b2a1b256f32cf69a2c42df45fa7f"}
Oct 12 21:01:06 crc kubenswrapper[4773]: I1012 21:01:06.345529 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e3287b51be9bd086c31a7267c697748a1b8b2a1b256f32cf69a2c42df45fa7f"
Oct 12 21:01:06 crc kubenswrapper[4773]: I1012 21:01:06.345146 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29338381-pxljd"
Oct 12 21:01:12 crc kubenswrapper[4773]: I1012 21:01:12.397479 4773 generic.go:334] "Generic (PLEG): container finished" podID="25b3f977-6673-4aa8-aadc-89d98ceb7638" containerID="13130b97f94fe9337363410dbe470459a55b2612cbdac569ef597f976f057562" exitCode=0
Oct 12 21:01:12 crc kubenswrapper[4773]: I1012 21:01:12.397567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" event={"ID":"25b3f977-6673-4aa8-aadc-89d98ceb7638","Type":"ContainerDied","Data":"13130b97f94fe9337363410dbe470459a55b2612cbdac569ef597f976f057562"}
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.839617 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2"
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.987481 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key\") pod \"25b3f977-6673-4aa8-aadc-89d98ceb7638\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") "
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.987575 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory\") pod \"25b3f977-6673-4aa8-aadc-89d98ceb7638\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") "
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.987641 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle\") pod \"25b3f977-6673-4aa8-aadc-89d98ceb7638\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") "
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.987810 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph\") pod \"25b3f977-6673-4aa8-aadc-89d98ceb7638\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") "
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.987917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l45s\" (UniqueName: \"kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s\") pod \"25b3f977-6673-4aa8-aadc-89d98ceb7638\" (UID: \"25b3f977-6673-4aa8-aadc-89d98ceb7638\") "
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.993297 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph" (OuterVolumeSpecName: "ceph") pod "25b3f977-6673-4aa8-aadc-89d98ceb7638" (UID: "25b3f977-6673-4aa8-aadc-89d98ceb7638"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.995939 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s" (OuterVolumeSpecName: "kube-api-access-8l45s") pod "25b3f977-6673-4aa8-aadc-89d98ceb7638" (UID: "25b3f977-6673-4aa8-aadc-89d98ceb7638"). InnerVolumeSpecName "kube-api-access-8l45s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 21:01:13 crc kubenswrapper[4773]: I1012 21:01:13.998990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "25b3f977-6673-4aa8-aadc-89d98ceb7638" (UID: "25b3f977-6673-4aa8-aadc-89d98ceb7638"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.029425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "25b3f977-6673-4aa8-aadc-89d98ceb7638" (UID: "25b3f977-6673-4aa8-aadc-89d98ceb7638"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.031095 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory" (OuterVolumeSpecName: "inventory") pod "25b3f977-6673-4aa8-aadc-89d98ceb7638" (UID: "25b3f977-6673-4aa8-aadc-89d98ceb7638"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.090270 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ceph\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.090318 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l45s\" (UniqueName: \"kubernetes.io/projected/25b3f977-6673-4aa8-aadc-89d98ceb7638-kube-api-access-8l45s\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.090337 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.090354 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.090372 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b3f977-6673-4aa8-aadc-89d98ceb7638-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.422984 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2" event={"ID":"25b3f977-6673-4aa8-aadc-89d98ceb7638","Type":"ContainerDied","Data":"2b0b84b31d52091e2ebfea6eeddb577e2ead6ec8aaa0850e2f1c443e28521097"}
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.423043 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b0b84b31d52091e2ebfea6eeddb577e2ead6ec8aaa0850e2f1c443e28521097"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.423090 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.518482 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"]
Oct 12 21:01:14 crc kubenswrapper[4773]: E1012 21:01:14.518901 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2812224-3ef8-431f-896d-01d9d78c3650" containerName="keystone-cron"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.518922 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2812224-3ef8-431f-896d-01d9d78c3650" containerName="keystone-cron"
Oct 12 21:01:14 crc kubenswrapper[4773]: E1012 21:01:14.518959 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b3f977-6673-4aa8-aadc-89d98ceb7638" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.518969 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b3f977-6673-4aa8-aadc-89d98ceb7638" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.519153 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b3f977-6673-4aa8-aadc-89d98ceb7638" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.519182 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2812224-3ef8-431f-896d-01d9d78c3650" containerName="keystone-cron"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.519852 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.522906 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.523035 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.523530 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.528477 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.529250 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.535143 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"]
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.600699 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.600814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgrh\" (UniqueName: \"kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.600870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.600930 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.702292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.702408 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.702585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.702681 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgrh\" (UniqueName: \"kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.705857 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.706045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.708677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.720558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgrh\" (UniqueName: \"kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xzgst\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:14 crc kubenswrapper[4773]: I1012 21:01:14.837162 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:15 crc kubenswrapper[4773]: I1012 21:01:15.380194 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"]
Oct 12 21:01:15 crc kubenswrapper[4773]: I1012 21:01:15.430871 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst" event={"ID":"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617","Type":"ContainerStarted","Data":"4ccc156d377cebcb28bc66116b95dc9b1b2aa95091cc2c8bcc28d991b89d5bff"}
Oct 12 21:01:16 crc kubenswrapper[4773]: I1012 21:01:16.439446 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst" event={"ID":"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617","Type":"ContainerStarted","Data":"d7e7fed1e4ca35d1a023c9c0e69dadf67d4c6a4a3aa3fb0939b3c5b66a583019"}
Oct 12 21:01:16 crc kubenswrapper[4773]: I1012 21:01:16.465269 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst" podStartSLOduration=1.906756329 podStartE2EDuration="2.465235357s" podCreationTimestamp="2025-10-12 21:01:14 +0000 UTC" firstStartedPulling="2025-10-12 21:01:15.395030083 +0000 UTC m=+2223.631328643" lastFinishedPulling="2025-10-12 21:01:15.953509121 +0000 UTC m=+2224.189807671" observedRunningTime="2025-10-12 21:01:16.454457557 +0000 UTC m=+2224.690756117" watchObservedRunningTime="2025-10-12 21:01:16.465235357 +0000 UTC m=+2224.701533957"
Oct 12 21:01:28 crc kubenswrapper[4773]: I1012 21:01:28.670076 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 21:01:28 crc kubenswrapper[4773]: I1012 21:01:28.672541 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 21:01:28 crc kubenswrapper[4773]: I1012 21:01:28.672613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j"
Oct 12 21:01:28 crc kubenswrapper[4773]: I1012 21:01:28.673816 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 21:01:28 crc kubenswrapper[4773]: I1012 21:01:28.673970 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" gracePeriod=600
Oct 12 21:01:29 crc kubenswrapper[4773]: E1012 21:01:29.343900 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:01:29 crc kubenswrapper[4773]: I1012 21:01:29.575618 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" exitCode=0
Oct 12 21:01:29 crc kubenswrapper[4773]: I1012 21:01:29.575671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8"}
Oct 12 21:01:29 crc kubenswrapper[4773]: I1012 21:01:29.575708 4773 scope.go:117] "RemoveContainer" containerID="c2b7161e75b51c032d700e35ee841759ad6f99f486fbe60853468eaca3289d2d"
Oct 12 21:01:29 crc kubenswrapper[4773]: I1012 21:01:29.576490 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8"
Oct 12 21:01:29 crc kubenswrapper[4773]: E1012 21:01:29.576977 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:01:43 crc kubenswrapper[4773]: I1012 21:01:43.482052 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8"
Oct 12 21:01:43 crc kubenswrapper[4773]: E1012 21:01:43.483171 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:01:45 crc kubenswrapper[4773]: I1012 21:01:45.735640 4773 generic.go:334] "Generic (PLEG): container finished" podID="dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" containerID="d7e7fed1e4ca35d1a023c9c0e69dadf67d4c6a4a3aa3fb0939b3c5b66a583019" exitCode=0
Oct 12 21:01:45 crc kubenswrapper[4773]: I1012 21:01:45.735810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst" event={"ID":"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617","Type":"ContainerDied","Data":"d7e7fed1e4ca35d1a023c9c0e69dadf67d4c6a4a3aa3fb0939b3c5b66a583019"}
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.301477 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.453607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgrh\" (UniqueName: \"kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh\") pod \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") "
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.453769 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory\") pod \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") "
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.453887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph\") pod \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") "
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.453926 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key\") pod \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\" (UID: \"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617\") "
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.459293 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh" (OuterVolumeSpecName: "kube-api-access-vxgrh") pod "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" (UID: "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617"). InnerVolumeSpecName "kube-api-access-vxgrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.460130 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph" (OuterVolumeSpecName: "ceph") pod "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" (UID: "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.483853 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory" (OuterVolumeSpecName: "inventory") pod "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" (UID: "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.486508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" (UID: "dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.556705 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxgrh\" (UniqueName: \"kubernetes.io/projected/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-kube-api-access-vxgrh\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.556760 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.556774 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ceph\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.556785 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.754039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst" event={"ID":"dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617","Type":"ContainerDied","Data":"4ccc156d377cebcb28bc66116b95dc9b1b2aa95091cc2c8bcc28d991b89d5bff"}
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.754321 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ccc156d377cebcb28bc66116b95dc9b1b2aa95091cc2c8bcc28d991b89d5bff"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.754247 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xzgst"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.845484 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j"]
Oct 12 21:01:47 crc kubenswrapper[4773]: E1012 21:01:47.845875 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.845890 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.846135 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.846710 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.849964 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.850576 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.850926 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.851212 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.851328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.861301 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wz57\" (UniqueName: \"kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.861348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.861513 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.861560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.873845 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j"] Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.964357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.964437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.964503 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8wz57\" (UniqueName: \"kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.964540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.969417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.969433 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.973256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:47 crc kubenswrapper[4773]: I1012 21:01:47.987646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wz57\" (UniqueName: \"kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:48 crc kubenswrapper[4773]: I1012 21:01:48.173446 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:48 crc kubenswrapper[4773]: I1012 21:01:48.741416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j"] Oct 12 21:01:48 crc kubenswrapper[4773]: I1012 21:01:48.792452 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" event={"ID":"9ef8a23e-6501-4e90-a51c-0d57cee847af","Type":"ContainerStarted","Data":"96b2a3b98f692c72c552d51351684bcef9e5e447d432bd0d6cbabd606d1d17a4"} Oct 12 21:01:49 crc kubenswrapper[4773]: I1012 21:01:49.802795 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" event={"ID":"9ef8a23e-6501-4e90-a51c-0d57cee847af","Type":"ContainerStarted","Data":"9cabcf1c129daf5a3e5f3e86e3e77972a9c2740c98981c3a374bc91dd90952d1"} Oct 12 21:01:54 crc kubenswrapper[4773]: I1012 21:01:54.487084 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:01:54 crc kubenswrapper[4773]: E1012 21:01:54.487828 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:01:55 crc kubenswrapper[4773]: I1012 21:01:55.878255 4773 generic.go:334] "Generic (PLEG): container finished" podID="9ef8a23e-6501-4e90-a51c-0d57cee847af" containerID="9cabcf1c129daf5a3e5f3e86e3e77972a9c2740c98981c3a374bc91dd90952d1" exitCode=0 Oct 12 21:01:55 crc kubenswrapper[4773]: I1012 21:01:55.878331 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" event={"ID":"9ef8a23e-6501-4e90-a51c-0d57cee847af","Type":"ContainerDied","Data":"9cabcf1c129daf5a3e5f3e86e3e77972a9c2740c98981c3a374bc91dd90952d1"} Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.281230 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.371157 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key\") pod \"9ef8a23e-6501-4e90-a51c-0d57cee847af\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.371333 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory\") pod \"9ef8a23e-6501-4e90-a51c-0d57cee847af\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.371567 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wz57\" (UniqueName: \"kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57\") pod \"9ef8a23e-6501-4e90-a51c-0d57cee847af\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.371748 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph\") pod \"9ef8a23e-6501-4e90-a51c-0d57cee847af\" (UID: \"9ef8a23e-6501-4e90-a51c-0d57cee847af\") " Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.387392 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57" (OuterVolumeSpecName: "kube-api-access-8wz57") pod "9ef8a23e-6501-4e90-a51c-0d57cee847af" (UID: "9ef8a23e-6501-4e90-a51c-0d57cee847af"). InnerVolumeSpecName "kube-api-access-8wz57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.387517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph" (OuterVolumeSpecName: "ceph") pod "9ef8a23e-6501-4e90-a51c-0d57cee847af" (UID: "9ef8a23e-6501-4e90-a51c-0d57cee847af"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.395657 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ef8a23e-6501-4e90-a51c-0d57cee847af" (UID: "9ef8a23e-6501-4e90-a51c-0d57cee847af"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.418613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory" (OuterVolumeSpecName: "inventory") pod "9ef8a23e-6501-4e90-a51c-0d57cee847af" (UID: "9ef8a23e-6501-4e90-a51c-0d57cee847af"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.474355 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wz57\" (UniqueName: \"kubernetes.io/projected/9ef8a23e-6501-4e90-a51c-0d57cee847af-kube-api-access-8wz57\") on node \"crc\" DevicePath \"\"" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.474388 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.474397 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.474405 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef8a23e-6501-4e90-a51c-0d57cee847af-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.904278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" event={"ID":"9ef8a23e-6501-4e90-a51c-0d57cee847af","Type":"ContainerDied","Data":"96b2a3b98f692c72c552d51351684bcef9e5e447d432bd0d6cbabd606d1d17a4"} Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.904371 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96b2a3b98f692c72c552d51351684bcef9e5e447d432bd0d6cbabd606d1d17a4" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.904310 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.993127 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz"] Oct 12 21:01:57 crc kubenswrapper[4773]: E1012 21:01:57.993538 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef8a23e-6501-4e90-a51c-0d57cee847af" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.993561 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef8a23e-6501-4e90-a51c-0d57cee847af" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.993824 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef8a23e-6501-4e90-a51c-0d57cee847af" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.994526 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:57 crc kubenswrapper[4773]: I1012 21:01:57.999750 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:57.999796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:57.999808 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:57.999939 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.004079 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.008151 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz"] Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.085302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js87r\" (UniqueName: \"kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.085370 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: 
\"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.085431 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.085457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.186528 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js87r\" (UniqueName: \"kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.186609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.186662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.186687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.190079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.190602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.192748 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.201671 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js87r\" (UniqueName: \"kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9wz\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.311905 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.872043 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz"] Oct 12 21:01:58 crc kubenswrapper[4773]: I1012 21:01:58.920372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" event={"ID":"4d530e38-f79d-4b93-9d2a-ad94eddb69b1","Type":"ContainerStarted","Data":"761a18ebb953f9d98694e5a1e8c157ac2a01475536fe9ed0ae1a3f70ff9a8860"} Oct 12 21:01:59 crc kubenswrapper[4773]: I1012 21:01:59.927867 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" event={"ID":"4d530e38-f79d-4b93-9d2a-ad94eddb69b1","Type":"ContainerStarted","Data":"459a779ae01834bd08fcdea90cdaa72acb1746b58921ecf67ea64b8bc4746b04"} Oct 12 21:01:59 crc kubenswrapper[4773]: I1012 21:01:59.945489 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" podStartSLOduration=2.532110503 podStartE2EDuration="2.945469771s" podCreationTimestamp="2025-10-12 21:01:57 +0000 UTC" firstStartedPulling="2025-10-12 21:01:58.884537344 +0000 UTC m=+2267.120835904" lastFinishedPulling="2025-10-12 21:01:59.297896592 +0000 UTC m=+2267.534195172" observedRunningTime="2025-10-12 21:01:59.939552466 +0000 UTC 
m=+2268.175851026" watchObservedRunningTime="2025-10-12 21:01:59.945469771 +0000 UTC m=+2268.181768331" Oct 12 21:02:09 crc kubenswrapper[4773]: I1012 21:02:09.481597 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:02:09 crc kubenswrapper[4773]: E1012 21:02:09.482322 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:02:20 crc kubenswrapper[4773]: I1012 21:02:20.481508 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:02:20 crc kubenswrapper[4773]: E1012 21:02:20.482409 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:02:35 crc kubenswrapper[4773]: I1012 21:02:35.483886 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:02:35 crc kubenswrapper[4773]: E1012 21:02:35.485542 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:02:46 crc kubenswrapper[4773]: I1012 21:02:46.377588 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d530e38-f79d-4b93-9d2a-ad94eddb69b1" containerID="459a779ae01834bd08fcdea90cdaa72acb1746b58921ecf67ea64b8bc4746b04" exitCode=0 Oct 12 21:02:46 crc kubenswrapper[4773]: I1012 21:02:46.377770 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" event={"ID":"4d530e38-f79d-4b93-9d2a-ad94eddb69b1","Type":"ContainerDied","Data":"459a779ae01834bd08fcdea90cdaa72acb1746b58921ecf67ea64b8bc4746b04"} Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.786067 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.869394 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph\") pod \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.869485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory\") pod \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.869595 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js87r\" (UniqueName: \"kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r\") pod 
\"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.869659 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key\") pod \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\" (UID: \"4d530e38-f79d-4b93-9d2a-ad94eddb69b1\") " Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.875087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r" (OuterVolumeSpecName: "kube-api-access-js87r") pod "4d530e38-f79d-4b93-9d2a-ad94eddb69b1" (UID: "4d530e38-f79d-4b93-9d2a-ad94eddb69b1"). InnerVolumeSpecName "kube-api-access-js87r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.878019 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph" (OuterVolumeSpecName: "ceph") pod "4d530e38-f79d-4b93-9d2a-ad94eddb69b1" (UID: "4d530e38-f79d-4b93-9d2a-ad94eddb69b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.900608 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d530e38-f79d-4b93-9d2a-ad94eddb69b1" (UID: "4d530e38-f79d-4b93-9d2a-ad94eddb69b1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.907445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory" (OuterVolumeSpecName: "inventory") pod "4d530e38-f79d-4b93-9d2a-ad94eddb69b1" (UID: "4d530e38-f79d-4b93-9d2a-ad94eddb69b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.972451 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.972478 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.972491 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js87r\" (UniqueName: \"kubernetes.io/projected/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-kube-api-access-js87r\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:47 crc kubenswrapper[4773]: I1012 21:02:47.972502 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d530e38-f79d-4b93-9d2a-ad94eddb69b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.397575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" event={"ID":"4d530e38-f79d-4b93-9d2a-ad94eddb69b1","Type":"ContainerDied","Data":"761a18ebb953f9d98694e5a1e8c157ac2a01475536fe9ed0ae1a3f70ff9a8860"} Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.397919 4773 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="761a18ebb953f9d98694e5a1e8c157ac2a01475536fe9ed0ae1a3f70ff9a8860" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.397874 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9wz" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.481678 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:02:48 crc kubenswrapper[4773]: E1012 21:02:48.481961 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.494206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q"] Oct 12 21:02:48 crc kubenswrapper[4773]: E1012 21:02:48.494595 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d530e38-f79d-4b93-9d2a-ad94eddb69b1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.494613 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d530e38-f79d-4b93-9d2a-ad94eddb69b1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.494810 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d530e38-f79d-4b93-9d2a-ad94eddb69b1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.495553 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.497158 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.501931 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.502132 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.502262 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.502392 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.509854 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q"] Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.583306 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.583369 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rhl\" (UniqueName: \"kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: 
\"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.583391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.583408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.685062 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.685119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rhl\" (UniqueName: \"kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.685162 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.685182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.689434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.689543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.695539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" 
Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.701076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rhl\" (UniqueName: \"kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:48 crc kubenswrapper[4773]: I1012 21:02:48.817433 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:49 crc kubenswrapper[4773]: I1012 21:02:49.308590 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q"] Oct 12 21:02:49 crc kubenswrapper[4773]: I1012 21:02:49.406236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" event={"ID":"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2","Type":"ContainerStarted","Data":"e4d11e575e3dff2b7b743a24e5e5eb41c1cf1141a76414dd986a8fb639221f24"} Oct 12 21:02:50 crc kubenswrapper[4773]: I1012 21:02:50.415610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" event={"ID":"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2","Type":"ContainerStarted","Data":"0da7735c203121359d968640b11a34233bba51bf6e28003f936e3bb715e667f0"} Oct 12 21:02:54 crc kubenswrapper[4773]: I1012 21:02:54.447475 4773 generic.go:334] "Generic (PLEG): container finished" podID="9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" containerID="0da7735c203121359d968640b11a34233bba51bf6e28003f936e3bb715e667f0" exitCode=0 Oct 12 21:02:54 crc kubenswrapper[4773]: I1012 21:02:54.447687 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" 
event={"ID":"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2","Type":"ContainerDied","Data":"0da7735c203121359d968640b11a34233bba51bf6e28003f936e3bb715e667f0"} Oct 12 21:02:55 crc kubenswrapper[4773]: I1012 21:02:55.923902 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.049494 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8rhl\" (UniqueName: \"kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl\") pod \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.049954 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key\") pod \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.050033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph\") pod \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.050102 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory\") pod \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\" (UID: \"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2\") " Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.054529 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph" (OuterVolumeSpecName: "ceph") pod 
"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" (UID: "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.057493 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl" (OuterVolumeSpecName: "kube-api-access-b8rhl") pod "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" (UID: "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2"). InnerVolumeSpecName "kube-api-access-b8rhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.077898 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" (UID: "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.080474 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory" (OuterVolumeSpecName: "inventory") pod "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" (UID: "9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.152809 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.152863 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8rhl\" (UniqueName: \"kubernetes.io/projected/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-kube-api-access-b8rhl\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.152885 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.152904 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.466540 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" event={"ID":"9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2","Type":"ContainerDied","Data":"e4d11e575e3dff2b7b743a24e5e5eb41c1cf1141a76414dd986a8fb639221f24"} Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.466598 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d11e575e3dff2b7b743a24e5e5eb41c1cf1141a76414dd986a8fb639221f24" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.466672 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.547426 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9"] Oct 12 21:02:56 crc kubenswrapper[4773]: E1012 21:02:56.547844 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.547861 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.548071 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.548761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.551814 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.552001 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.552220 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.555355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.556297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9"] Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.558388 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.591031 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.591080 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.591120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqqs\" (UniqueName: \"kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.591236 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.693059 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.693154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.693177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.693209 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqqs\" (UniqueName: \"kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.697943 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.703325 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.712066 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc 
kubenswrapper[4773]: I1012 21:02:56.724847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqqs\" (UniqueName: \"kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f77w9\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:56 crc kubenswrapper[4773]: I1012 21:02:56.903036 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:02:57 crc kubenswrapper[4773]: I1012 21:02:57.434357 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9"] Oct 12 21:02:57 crc kubenswrapper[4773]: I1012 21:02:57.476401 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" event={"ID":"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1","Type":"ContainerStarted","Data":"8cbedf5d4e0c0ea3c39db67a78d0d986082a69cf58f9764d092012b0388f61e1"} Oct 12 21:02:58 crc kubenswrapper[4773]: I1012 21:02:58.489835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" event={"ID":"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1","Type":"ContainerStarted","Data":"3253e1e47642ed4878500c21c743dd0c06043a9ec4ea4c191f1b33ea9040f91f"} Oct 12 21:02:58 crc kubenswrapper[4773]: I1012 21:02:58.512358 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" podStartSLOduration=2.086018728 podStartE2EDuration="2.512330215s" podCreationTimestamp="2025-10-12 21:02:56 +0000 UTC" firstStartedPulling="2025-10-12 21:02:57.444763995 +0000 UTC m=+2325.681062565" lastFinishedPulling="2025-10-12 21:02:57.871075482 +0000 UTC m=+2326.107374052" 
observedRunningTime="2025-10-12 21:02:58.503094178 +0000 UTC m=+2326.739392778" watchObservedRunningTime="2025-10-12 21:02:58.512330215 +0000 UTC m=+2326.748628795" Oct 12 21:03:02 crc kubenswrapper[4773]: I1012 21:03:02.489090 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:03:02 crc kubenswrapper[4773]: E1012 21:03:02.490113 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:03:16 crc kubenswrapper[4773]: I1012 21:03:16.481385 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:03:16 crc kubenswrapper[4773]: E1012 21:03:16.482308 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:03:31 crc kubenswrapper[4773]: I1012 21:03:31.481907 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:03:31 crc kubenswrapper[4773]: E1012 21:03:31.482793 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:03:44 crc kubenswrapper[4773]: I1012 21:03:44.481201 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:03:44 crc kubenswrapper[4773]: E1012 21:03:44.482222 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:03:54 crc kubenswrapper[4773]: I1012 21:03:54.995150 4773 generic.go:334] "Generic (PLEG): container finished" podID="17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" containerID="3253e1e47642ed4878500c21c743dd0c06043a9ec4ea4c191f1b33ea9040f91f" exitCode=0 Oct 12 21:03:54 crc kubenswrapper[4773]: I1012 21:03:54.995253 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" event={"ID":"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1","Type":"ContainerDied","Data":"3253e1e47642ed4878500c21c743dd0c06043a9ec4ea4c191f1b33ea9040f91f"} Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.436415 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.583620 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key\") pod \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.583839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqqs\" (UniqueName: \"kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs\") pod \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.583939 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph\") pod \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.584011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory\") pod \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\" (UID: \"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1\") " Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.590784 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs" (OuterVolumeSpecName: "kube-api-access-8vqqs") pod "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" (UID: "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1"). InnerVolumeSpecName "kube-api-access-8vqqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.591250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph" (OuterVolumeSpecName: "ceph") pod "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" (UID: "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.615527 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" (UID: "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.615793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory" (OuterVolumeSpecName: "inventory") pod "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" (UID: "17284681-e0c1-42f8-8ee2-2b3f8e73e6d1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.686917 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.687076 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.687158 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:03:56 crc kubenswrapper[4773]: I1012 21:03:56.687243 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqqs\" (UniqueName: \"kubernetes.io/projected/17284681-e0c1-42f8-8ee2-2b3f8e73e6d1-kube-api-access-8vqqs\") on node \"crc\" DevicePath \"\"" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.013616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" event={"ID":"17284681-e0c1-42f8-8ee2-2b3f8e73e6d1","Type":"ContainerDied","Data":"8cbedf5d4e0c0ea3c39db67a78d0d986082a69cf58f9764d092012b0388f61e1"} Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.013944 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cbedf5d4e0c0ea3c39db67a78d0d986082a69cf58f9764d092012b0388f61e1" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.013700 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f77w9" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.118496 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sdsnz"] Oct 12 21:03:57 crc kubenswrapper[4773]: E1012 21:03:57.119183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.119292 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.119614 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="17284681-e0c1-42f8-8ee2-2b3f8e73e6d1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.120394 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.122696 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.122983 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.123273 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.123507 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.127280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.135610 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sdsnz"] Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.299788 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.299889 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xvc\" (UniqueName: \"kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 
21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.299931 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.300117 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.402185 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7xvc\" (UniqueName: \"kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.402259 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.402308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: 
\"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.402365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.406077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.407114 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.415650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.423599 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xvc\" (UniqueName: \"kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc\") pod \"ssh-known-hosts-edpm-deployment-sdsnz\" (UID: 
\"8c71487d-25fd-480c-90ca-4ca43f86a247\") " pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.470740 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:03:57 crc kubenswrapper[4773]: I1012 21:03:57.972063 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sdsnz"] Oct 12 21:03:58 crc kubenswrapper[4773]: I1012 21:03:58.030567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" event={"ID":"8c71487d-25fd-480c-90ca-4ca43f86a247","Type":"ContainerStarted","Data":"864bd7534cba8087740db0fd36cad8e587fc297cd7330bc048601aa855a46fe8"} Oct 12 21:03:58 crc kubenswrapper[4773]: I1012 21:03:58.481563 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:03:58 crc kubenswrapper[4773]: E1012 21:03:58.482248 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:03:59 crc kubenswrapper[4773]: I1012 21:03:59.042692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" event={"ID":"8c71487d-25fd-480c-90ca-4ca43f86a247","Type":"ContainerStarted","Data":"6887a8d5ecf47f599c5e5c2d89137894914ddb2f9ed37c904bd77fc3d9039b43"} Oct 12 21:03:59 crc kubenswrapper[4773]: I1012 21:03:59.077576 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" 
podStartSLOduration=1.5514874619999999 podStartE2EDuration="2.077548318s" podCreationTimestamp="2025-10-12 21:03:57 +0000 UTC" firstStartedPulling="2025-10-12 21:03:57.980962179 +0000 UTC m=+2386.217260759" lastFinishedPulling="2025-10-12 21:03:58.507023015 +0000 UTC m=+2386.743321615" observedRunningTime="2025-10-12 21:03:59.063440055 +0000 UTC m=+2387.299738655" watchObservedRunningTime="2025-10-12 21:03:59.077548318 +0000 UTC m=+2387.313846908" Oct 12 21:04:10 crc kubenswrapper[4773]: I1012 21:04:10.157935 4773 generic.go:334] "Generic (PLEG): container finished" podID="8c71487d-25fd-480c-90ca-4ca43f86a247" containerID="6887a8d5ecf47f599c5e5c2d89137894914ddb2f9ed37c904bd77fc3d9039b43" exitCode=0 Oct 12 21:04:10 crc kubenswrapper[4773]: I1012 21:04:10.158014 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" event={"ID":"8c71487d-25fd-480c-90ca-4ca43f86a247","Type":"ContainerDied","Data":"6887a8d5ecf47f599c5e5c2d89137894914ddb2f9ed37c904bd77fc3d9039b43"} Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.481849 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:04:11 crc kubenswrapper[4773]: E1012 21:04:11.482451 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.555133 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.678226 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam\") pod \"8c71487d-25fd-480c-90ca-4ca43f86a247\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.678423 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph\") pod \"8c71487d-25fd-480c-90ca-4ca43f86a247\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.678508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0\") pod \"8c71487d-25fd-480c-90ca-4ca43f86a247\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.678556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7xvc\" (UniqueName: \"kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc\") pod \"8c71487d-25fd-480c-90ca-4ca43f86a247\" (UID: \"8c71487d-25fd-480c-90ca-4ca43f86a247\") " Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.683699 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc" (OuterVolumeSpecName: "kube-api-access-d7xvc") pod "8c71487d-25fd-480c-90ca-4ca43f86a247" (UID: "8c71487d-25fd-480c-90ca-4ca43f86a247"). InnerVolumeSpecName "kube-api-access-d7xvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.684304 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph" (OuterVolumeSpecName: "ceph") pod "8c71487d-25fd-480c-90ca-4ca43f86a247" (UID: "8c71487d-25fd-480c-90ca-4ca43f86a247"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.701859 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c71487d-25fd-480c-90ca-4ca43f86a247" (UID: "8c71487d-25fd-480c-90ca-4ca43f86a247"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.708099 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8c71487d-25fd-480c-90ca-4ca43f86a247" (UID: "8c71487d-25fd-480c-90ca-4ca43f86a247"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.781081 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.781148 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.781180 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8c71487d-25fd-480c-90ca-4ca43f86a247-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:11 crc kubenswrapper[4773]: I1012 21:04:11.781209 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7xvc\" (UniqueName: \"kubernetes.io/projected/8c71487d-25fd-480c-90ca-4ca43f86a247-kube-api-access-d7xvc\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.178179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" event={"ID":"8c71487d-25fd-480c-90ca-4ca43f86a247","Type":"ContainerDied","Data":"864bd7534cba8087740db0fd36cad8e587fc297cd7330bc048601aa855a46fe8"} Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.178728 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864bd7534cba8087740db0fd36cad8e587fc297cd7330bc048601aa855a46fe8" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.178267 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sdsnz" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.287203 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl"] Oct 12 21:04:12 crc kubenswrapper[4773]: E1012 21:04:12.287588 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c71487d-25fd-480c-90ca-4ca43f86a247" containerName="ssh-known-hosts-edpm-deployment" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.287607 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c71487d-25fd-480c-90ca-4ca43f86a247" containerName="ssh-known-hosts-edpm-deployment" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.287870 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c71487d-25fd-480c-90ca-4ca43f86a247" containerName="ssh-known-hosts-edpm-deployment" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.288601 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.292122 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.292376 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.292540 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.292674 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.292841 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.304043 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl"] Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.405055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.405346 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.405459 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6f6q\" (UniqueName: \"kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.405567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.507167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6f6q\" (UniqueName: \"kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.507744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.508017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.508140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.509984 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.510708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.512401 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.523434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.526422 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 
21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.526447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.528589 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6f6q\" (UniqueName: \"kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wptgl\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.613938 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:04:12 crc kubenswrapper[4773]: I1012 21:04:12.621899 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:13 crc kubenswrapper[4773]: I1012 21:04:13.165762 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl"] Oct 12 21:04:13 crc kubenswrapper[4773]: W1012 21:04:13.177043 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83cb532a_174c_41c0_a271_95a66d439f0c.slice/crio-b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612 WatchSource:0}: Error finding container b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612: Status 404 returned error can't find the container with id b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612 Oct 12 21:04:13 crc kubenswrapper[4773]: I1012 21:04:13.181779 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:04:13 crc kubenswrapper[4773]: I1012 21:04:13.195566 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" event={"ID":"83cb532a-174c-41c0-a271-95a66d439f0c","Type":"ContainerStarted","Data":"b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612"} Oct 12 21:04:13 crc kubenswrapper[4773]: I1012 21:04:13.714980 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:04:14 crc kubenswrapper[4773]: I1012 21:04:14.206257 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" event={"ID":"83cb532a-174c-41c0-a271-95a66d439f0c","Type":"ContainerStarted","Data":"0593e326e13a26376143266082bde3cc31685f92da459837d81c5efa0487fa85"} Oct 12 21:04:14 crc kubenswrapper[4773]: I1012 21:04:14.232930 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" podStartSLOduration=1.702240899 podStartE2EDuration="2.232905352s" podCreationTimestamp="2025-10-12 21:04:12 +0000 UTC" firstStartedPulling="2025-10-12 21:04:13.181227156 +0000 UTC m=+2401.417525756" lastFinishedPulling="2025-10-12 21:04:13.711891609 +0000 UTC m=+2401.948190209" observedRunningTime="2025-10-12 21:04:14.229430238 +0000 UTC m=+2402.465728828" watchObservedRunningTime="2025-10-12 21:04:14.232905352 +0000 UTC m=+2402.469203922" Oct 12 21:04:22 crc kubenswrapper[4773]: I1012 21:04:22.275121 4773 generic.go:334] "Generic (PLEG): container finished" podID="83cb532a-174c-41c0-a271-95a66d439f0c" containerID="0593e326e13a26376143266082bde3cc31685f92da459837d81c5efa0487fa85" exitCode=0 Oct 12 21:04:22 crc kubenswrapper[4773]: I1012 21:04:22.275218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" event={"ID":"83cb532a-174c-41c0-a271-95a66d439f0c","Type":"ContainerDied","Data":"0593e326e13a26376143266082bde3cc31685f92da459837d81c5efa0487fa85"} Oct 12 21:04:22 crc kubenswrapper[4773]: I1012 21:04:22.486460 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:04:22 crc kubenswrapper[4773]: E1012 21:04:22.487261 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.710426 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.832863 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph\") pod \"83cb532a-174c-41c0-a271-95a66d439f0c\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.832921 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key\") pod \"83cb532a-174c-41c0-a271-95a66d439f0c\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.832956 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory\") pod \"83cb532a-174c-41c0-a271-95a66d439f0c\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.833085 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6f6q\" (UniqueName: \"kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q\") pod \"83cb532a-174c-41c0-a271-95a66d439f0c\" (UID: \"83cb532a-174c-41c0-a271-95a66d439f0c\") " Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.841525 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph" (OuterVolumeSpecName: "ceph") pod "83cb532a-174c-41c0-a271-95a66d439f0c" (UID: "83cb532a-174c-41c0-a271-95a66d439f0c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.841947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q" (OuterVolumeSpecName: "kube-api-access-q6f6q") pod "83cb532a-174c-41c0-a271-95a66d439f0c" (UID: "83cb532a-174c-41c0-a271-95a66d439f0c"). InnerVolumeSpecName "kube-api-access-q6f6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.862582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "83cb532a-174c-41c0-a271-95a66d439f0c" (UID: "83cb532a-174c-41c0-a271-95a66d439f0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.863844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory" (OuterVolumeSpecName: "inventory") pod "83cb532a-174c-41c0-a271-95a66d439f0c" (UID: "83cb532a-174c-41c0-a271-95a66d439f0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.935387 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6f6q\" (UniqueName: \"kubernetes.io/projected/83cb532a-174c-41c0-a271-95a66d439f0c-kube-api-access-q6f6q\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.935415 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.935424 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:23 crc kubenswrapper[4773]: I1012 21:04:23.935432 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83cb532a-174c-41c0-a271-95a66d439f0c-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.296618 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" event={"ID":"83cb532a-174c-41c0-a271-95a66d439f0c","Type":"ContainerDied","Data":"b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612"} Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.296928 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b57cfd9fec8b06fd79121aafd8d25cbb6a7e70968f26b9ca9643e506d12e6612" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.296831 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wptgl" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.442829 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565"] Oct 12 21:04:24 crc kubenswrapper[4773]: E1012 21:04:24.443360 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cb532a-174c-41c0-a271-95a66d439f0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.443377 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cb532a-174c-41c0-a271-95a66d439f0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.443559 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="83cb532a-174c-41c0-a271-95a66d439f0c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.444166 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447081 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447148 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447167 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447253 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447283 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2fl\" (UniqueName: \"kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc 
kubenswrapper[4773]: I1012 21:04:24.447332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.447478 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.449291 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.461654 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565"] Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.549273 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.549414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2fl\" (UniqueName: \"kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.549478 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.549559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.554281 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.554585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.555536 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.565224 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bm2fl\" (UniqueName: \"kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ch565\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.765602 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.767294 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.771140 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.779322 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.857276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnv94\" (UniqueName: \"kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.857560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.858199 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.959953 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.960416 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.960504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnv94\" (UniqueName: \"kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.961557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.961810 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:24 crc kubenswrapper[4773]: I1012 21:04:24.992762 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnv94\" (UniqueName: \"kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94\") pod \"redhat-operators-dktnk\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:25 crc kubenswrapper[4773]: I1012 21:04:25.175161 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:25 crc kubenswrapper[4773]: I1012 21:04:25.378572 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565"] Oct 12 21:04:25 crc kubenswrapper[4773]: I1012 21:04:25.604470 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.317451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" event={"ID":"073e807a-4708-4b50-abf6-f66668e13e8e","Type":"ContainerStarted","Data":"bf1d6a78d376cf9f4a1bfb20daec16d027a3fd7ae9c8a9d3872c76575dd59299"} Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.317839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" event={"ID":"073e807a-4708-4b50-abf6-f66668e13e8e","Type":"ContainerStarted","Data":"4e91a6d0e617a8b65862594e2dfb36d28ff686f46c58c72cf060023df69aa870"} Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.321044 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerID="50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96" exitCode=0 Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.321080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerDied","Data":"50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96"} Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.321099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerStarted","Data":"cdd994838be1601258d27084da31aff089d789bc5a1c59ddc312288c41d65c29"} Oct 12 21:04:26 crc kubenswrapper[4773]: I1012 21:04:26.348247 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" podStartSLOduration=1.941142594 podStartE2EDuration="2.34822967s" podCreationTimestamp="2025-10-12 21:04:24 +0000 UTC" firstStartedPulling="2025-10-12 21:04:25.399172946 +0000 UTC m=+2413.635471496" lastFinishedPulling="2025-10-12 21:04:25.806260012 +0000 UTC m=+2414.042558572" observedRunningTime="2025-10-12 21:04:26.336327689 +0000 UTC m=+2414.572626249" watchObservedRunningTime="2025-10-12 21:04:26.34822967 +0000 UTC m=+2414.584528230" Oct 12 21:04:28 crc kubenswrapper[4773]: I1012 21:04:28.353267 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerStarted","Data":"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f"} Oct 12 21:04:31 crc kubenswrapper[4773]: I1012 21:04:31.378839 4773 generic.go:334] "Generic (PLEG): container finished" podID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" 
containerID="03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f" exitCode=0 Oct 12 21:04:31 crc kubenswrapper[4773]: I1012 21:04:31.378911 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerDied","Data":"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f"} Oct 12 21:04:32 crc kubenswrapper[4773]: I1012 21:04:32.401961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerStarted","Data":"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca"} Oct 12 21:04:32 crc kubenswrapper[4773]: I1012 21:04:32.429279 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dktnk" podStartSLOduration=2.92212455 podStartE2EDuration="8.429262384s" podCreationTimestamp="2025-10-12 21:04:24 +0000 UTC" firstStartedPulling="2025-10-12 21:04:26.322579077 +0000 UTC m=+2414.558877627" lastFinishedPulling="2025-10-12 21:04:31.829716881 +0000 UTC m=+2420.066015461" observedRunningTime="2025-10-12 21:04:32.424071934 +0000 UTC m=+2420.660370494" watchObservedRunningTime="2025-10-12 21:04:32.429262384 +0000 UTC m=+2420.665560944" Oct 12 21:04:35 crc kubenswrapper[4773]: I1012 21:04:35.175497 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:35 crc kubenswrapper[4773]: I1012 21:04:35.175868 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:04:36 crc kubenswrapper[4773]: I1012 21:04:36.222176 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dktnk" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" 
probeResult="failure" output=< Oct 12 21:04:36 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:04:36 crc kubenswrapper[4773]: > Oct 12 21:04:36 crc kubenswrapper[4773]: I1012 21:04:36.481046 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:04:36 crc kubenswrapper[4773]: E1012 21:04:36.481424 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:04:37 crc kubenswrapper[4773]: I1012 21:04:37.445455 4773 generic.go:334] "Generic (PLEG): container finished" podID="073e807a-4708-4b50-abf6-f66668e13e8e" containerID="bf1d6a78d376cf9f4a1bfb20daec16d027a3fd7ae9c8a9d3872c76575dd59299" exitCode=0 Oct 12 21:04:37 crc kubenswrapper[4773]: I1012 21:04:37.445605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" event={"ID":"073e807a-4708-4b50-abf6-f66668e13e8e","Type":"ContainerDied","Data":"bf1d6a78d376cf9f4a1bfb20daec16d027a3fd7ae9c8a9d3872c76575dd59299"} Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.924424 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.964963 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:04:38 crc kubenswrapper[4773]: E1012 21:04:38.965588 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073e807a-4708-4b50-abf6-f66668e13e8e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.965608 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="073e807a-4708-4b50-abf6-f66668e13e8e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.965838 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="073e807a-4708-4b50-abf6-f66668e13e8e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.967252 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:38 crc kubenswrapper[4773]: I1012 21:04:38.978861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph\") pod \"073e807a-4708-4b50-abf6-f66668e13e8e\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021117 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory\") pod \"073e807a-4708-4b50-abf6-f66668e13e8e\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021144 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2fl\" (UniqueName: \"kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl\") pod \"073e807a-4708-4b50-abf6-f66668e13e8e\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021182 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key\") pod \"073e807a-4708-4b50-abf6-f66668e13e8e\" (UID: \"073e807a-4708-4b50-abf6-f66668e13e8e\") " Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021321 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4s82\" (UniqueName: \"kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82\") pod \"community-operators-zvvpt\" (UID: 
\"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021370 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.021430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.026915 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl" (OuterVolumeSpecName: "kube-api-access-bm2fl") pod "073e807a-4708-4b50-abf6-f66668e13e8e" (UID: "073e807a-4708-4b50-abf6-f66668e13e8e"). InnerVolumeSpecName "kube-api-access-bm2fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.027010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph" (OuterVolumeSpecName: "ceph") pod "073e807a-4708-4b50-abf6-f66668e13e8e" (UID: "073e807a-4708-4b50-abf6-f66668e13e8e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.055978 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory" (OuterVolumeSpecName: "inventory") pod "073e807a-4708-4b50-abf6-f66668e13e8e" (UID: "073e807a-4708-4b50-abf6-f66668e13e8e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.079276 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "073e807a-4708-4b50-abf6-f66668e13e8e" (UID: "073e807a-4708-4b50-abf6-f66668e13e8e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.122762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.122831 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.122992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4s82\" (UniqueName: \"kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82\") pod \"community-operators-zvvpt\" (UID: 
\"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.123288 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.123561 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.123994 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm2fl\" (UniqueName: \"kubernetes.io/projected/073e807a-4708-4b50-abf6-f66668e13e8e-kube-api-access-bm2fl\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.124044 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.124062 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/073e807a-4708-4b50-abf6-f66668e13e8e-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.124122 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.142584 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l4s82\" (UniqueName: \"kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82\") pod \"community-operators-zvvpt\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.293447 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.470099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" event={"ID":"073e807a-4708-4b50-abf6-f66668e13e8e","Type":"ContainerDied","Data":"4e91a6d0e617a8b65862594e2dfb36d28ff686f46c58c72cf060023df69aa870"} Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.470231 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e91a6d0e617a8b65862594e2dfb36d28ff686f46c58c72cf060023df69aa870" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.470197 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ch565" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.628290 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk"] Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.629577 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.634646 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.634824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.635075 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.635206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.635328 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.635677 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.635942 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.648382 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.681754 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk"] Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc8q8\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733947 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.733987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734008 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734033 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734116 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.734189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835339 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835477 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc8q8\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835617 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835654 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.835691 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.843199 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.843618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.843804 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.844537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.844689 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.846382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.847415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.851936 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: 
I1012 21:04:39.852057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.852449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.853329 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.858586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.859130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc8q8\" 
(UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.939070 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:04:39 crc kubenswrapper[4773]: I1012 21:04:39.983870 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:04:40 crc kubenswrapper[4773]: I1012 21:04:40.496150 4773 generic.go:334] "Generic (PLEG): container finished" podID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerID="b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194" exitCode=0 Oct 12 21:04:40 crc kubenswrapper[4773]: I1012 21:04:40.522190 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerDied","Data":"b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194"} Oct 12 21:04:40 crc kubenswrapper[4773]: I1012 21:04:40.522237 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerStarted","Data":"7182fb721e39b91b3db5f369147175b50e81c4dc0d865d8bcfa43df72f3dac48"} Oct 12 21:04:40 crc kubenswrapper[4773]: I1012 21:04:40.539559 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk"] Oct 12 21:04:40 crc kubenswrapper[4773]: W1012 21:04:40.556571 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23eb0d3e_06b9_4b1e_b493_27d00d4f34f4.slice/crio-e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763 WatchSource:0}: Error finding container e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763: Status 404 returned error can't find the container with id e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763 Oct 12 21:04:41 crc kubenswrapper[4773]: I1012 21:04:41.504962 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerStarted","Data":"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570"} Oct 12 21:04:41 crc kubenswrapper[4773]: I1012 21:04:41.509766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" event={"ID":"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4","Type":"ContainerStarted","Data":"c464ef52d24b218aa2ecc1bb2f58957f326febd79aff0db64071e9d14c4b6e74"} Oct 12 21:04:41 crc kubenswrapper[4773]: I1012 21:04:41.509903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" event={"ID":"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4","Type":"ContainerStarted","Data":"e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763"} Oct 12 21:04:41 crc kubenswrapper[4773]: I1012 21:04:41.540633 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" podStartSLOduration=2.046247275 podStartE2EDuration="2.540618238s" podCreationTimestamp="2025-10-12 21:04:39 +0000 UTC" firstStartedPulling="2025-10-12 21:04:40.559962081 +0000 UTC m=+2428.796260641" lastFinishedPulling="2025-10-12 21:04:41.054333044 +0000 UTC m=+2429.290631604" observedRunningTime="2025-10-12 21:04:41.539152679 +0000 UTC m=+2429.775451249" 
watchObservedRunningTime="2025-10-12 21:04:41.540618238 +0000 UTC m=+2429.776916798" Oct 12 21:04:43 crc kubenswrapper[4773]: I1012 21:04:43.527607 4773 generic.go:334] "Generic (PLEG): container finished" podID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerID="efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570" exitCode=0 Oct 12 21:04:43 crc kubenswrapper[4773]: I1012 21:04:43.527899 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerDied","Data":"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570"} Oct 12 21:04:44 crc kubenswrapper[4773]: I1012 21:04:44.536408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerStarted","Data":"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef"} Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.228132 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dktnk" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" probeResult="failure" output=< Oct 12 21:04:46 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:04:46 crc kubenswrapper[4773]: > Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.344949 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zvvpt" podStartSLOduration=4.877490145 podStartE2EDuration="8.344931319s" podCreationTimestamp="2025-10-12 21:04:38 +0000 UTC" firstStartedPulling="2025-10-12 21:04:40.503118035 +0000 UTC m=+2428.739416605" lastFinishedPulling="2025-10-12 21:04:43.970559219 +0000 UTC m=+2432.206857779" observedRunningTime="2025-10-12 21:04:44.565156699 +0000 UTC m=+2432.801455249" watchObservedRunningTime="2025-10-12 
21:04:46.344931319 +0000 UTC m=+2434.581229879" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.345488 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.347376 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.364858 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.373415 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.373697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.373819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvndh\" (UniqueName: \"kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.476948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.477025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.477086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvndh\" (UniqueName: \"kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.477670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.477670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.502872 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvndh\" (UniqueName: 
\"kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh\") pod \"certified-operators-t9vpl\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:46 crc kubenswrapper[4773]: I1012 21:04:46.679860 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:47 crc kubenswrapper[4773]: I1012 21:04:47.231694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:04:47 crc kubenswrapper[4773]: I1012 21:04:47.565204 4773 generic.go:334] "Generic (PLEG): container finished" podID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerID="4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf" exitCode=0 Oct 12 21:04:47 crc kubenswrapper[4773]: I1012 21:04:47.565289 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerDied","Data":"4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf"} Oct 12 21:04:47 crc kubenswrapper[4773]: I1012 21:04:47.565549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerStarted","Data":"e7e2ebdf36d369cdeabeb2f7b402ecb2c7457054bc04417f31a01a4d8fb47e11"} Oct 12 21:04:48 crc kubenswrapper[4773]: I1012 21:04:48.577283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerStarted","Data":"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b"} Oct 12 21:04:49 crc kubenswrapper[4773]: I1012 21:04:49.294726 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:49 crc kubenswrapper[4773]: I1012 21:04:49.295455 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:04:49 crc kubenswrapper[4773]: I1012 21:04:49.589667 4773 generic.go:334] "Generic (PLEG): container finished" podID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerID="c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b" exitCode=0 Oct 12 21:04:49 crc kubenswrapper[4773]: I1012 21:04:49.589784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerDied","Data":"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b"} Oct 12 21:04:50 crc kubenswrapper[4773]: I1012 21:04:50.344093 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zvvpt" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" probeResult="failure" output=< Oct 12 21:04:50 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:04:50 crc kubenswrapper[4773]: > Oct 12 21:04:50 crc kubenswrapper[4773]: I1012 21:04:50.603102 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerStarted","Data":"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674"} Oct 12 21:04:50 crc kubenswrapper[4773]: I1012 21:04:50.628984 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t9vpl" podStartSLOduration=2.183368455 podStartE2EDuration="4.628966829s" podCreationTimestamp="2025-10-12 21:04:46 +0000 UTC" firstStartedPulling="2025-10-12 21:04:47.566587606 +0000 UTC m=+2435.802886166" lastFinishedPulling="2025-10-12 
21:04:50.01218596 +0000 UTC m=+2438.248484540" observedRunningTime="2025-10-12 21:04:50.62531972 +0000 UTC m=+2438.861618280" watchObservedRunningTime="2025-10-12 21:04:50.628966829 +0000 UTC m=+2438.865265389" Oct 12 21:04:51 crc kubenswrapper[4773]: I1012 21:04:51.481955 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:04:51 crc kubenswrapper[4773]: E1012 21:04:51.482521 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:04:56 crc kubenswrapper[4773]: I1012 21:04:56.251200 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dktnk" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" probeResult="failure" output=< Oct 12 21:04:56 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:04:56 crc kubenswrapper[4773]: > Oct 12 21:04:56 crc kubenswrapper[4773]: I1012 21:04:56.680954 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:56 crc kubenswrapper[4773]: I1012 21:04:56.680989 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:56 crc kubenswrapper[4773]: I1012 21:04:56.746503 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:57 crc kubenswrapper[4773]: I1012 21:04:57.724018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:04:57 crc kubenswrapper[4773]: I1012 21:04:57.793066 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:04:59 crc kubenswrapper[4773]: I1012 21:04:59.678296 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t9vpl" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="registry-server" containerID="cri-o://da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674" gracePeriod=2 Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.196080 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.337298 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zvvpt" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" probeResult="failure" output=< Oct 12 21:05:00 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:05:00 crc kubenswrapper[4773]: > Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.350752 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities\") pod \"b19adba1-d66e-4e68-a41c-b805b9a467e1\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.350821 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content\") pod \"b19adba1-d66e-4e68-a41c-b805b9a467e1\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.350865 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvndh\" (UniqueName: \"kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh\") pod \"b19adba1-d66e-4e68-a41c-b805b9a467e1\" (UID: \"b19adba1-d66e-4e68-a41c-b805b9a467e1\") " Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.351465 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities" (OuterVolumeSpecName: "utilities") pod "b19adba1-d66e-4e68-a41c-b805b9a467e1" (UID: "b19adba1-d66e-4e68-a41c-b805b9a467e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.356216 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh" (OuterVolumeSpecName: "kube-api-access-lvndh") pod "b19adba1-d66e-4e68-a41c-b805b9a467e1" (UID: "b19adba1-d66e-4e68-a41c-b805b9a467e1"). InnerVolumeSpecName "kube-api-access-lvndh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.400234 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b19adba1-d66e-4e68-a41c-b805b9a467e1" (UID: "b19adba1-d66e-4e68-a41c-b805b9a467e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.453139 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.453171 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19adba1-d66e-4e68-a41c-b805b9a467e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.453184 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvndh\" (UniqueName: \"kubernetes.io/projected/b19adba1-d66e-4e68-a41c-b805b9a467e1-kube-api-access-lvndh\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.688422 4773 generic.go:334] "Generic (PLEG): container finished" podID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerID="da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674" exitCode=0 Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.688465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerDied","Data":"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674"} Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.688496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9vpl" event={"ID":"b19adba1-d66e-4e68-a41c-b805b9a467e1","Type":"ContainerDied","Data":"e7e2ebdf36d369cdeabeb2f7b402ecb2c7457054bc04417f31a01a4d8fb47e11"} Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.688513 4773 scope.go:117] "RemoveContainer" containerID="da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 
21:05:00.688586 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9vpl" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.709146 4773 scope.go:117] "RemoveContainer" containerID="c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.710325 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.721822 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t9vpl"] Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.731706 4773 scope.go:117] "RemoveContainer" containerID="4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.790676 4773 scope.go:117] "RemoveContainer" containerID="da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674" Oct 12 21:05:00 crc kubenswrapper[4773]: E1012 21:05:00.791329 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674\": container with ID starting with da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674 not found: ID does not exist" containerID="da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.791393 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674"} err="failed to get container status \"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674\": rpc error: code = NotFound desc = could not find container \"da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674\": container with ID starting with 
da39d42543f8f0e5ef550ee7d3415cd0f1f38b3cdd79455f2a1240d9d3f73674 not found: ID does not exist" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.791424 4773 scope.go:117] "RemoveContainer" containerID="c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b" Oct 12 21:05:00 crc kubenswrapper[4773]: E1012 21:05:00.791789 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b\": container with ID starting with c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b not found: ID does not exist" containerID="c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.791814 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b"} err="failed to get container status \"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b\": rpc error: code = NotFound desc = could not find container \"c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b\": container with ID starting with c3491a6ae36e423f21f57543e7b2914b49d47d6572c28099ccc0139ffddb971b not found: ID does not exist" Oct 12 21:05:00 crc kubenswrapper[4773]: I1012 21:05:00.791833 4773 scope.go:117] "RemoveContainer" containerID="4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf" Oct 12 21:05:00 crc kubenswrapper[4773]: E1012 21:05:00.792108 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf\": container with ID starting with 4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf not found: ID does not exist" containerID="4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf" Oct 12 21:05:00 crc 
kubenswrapper[4773]: I1012 21:05:00.792133 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf"} err="failed to get container status \"4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf\": rpc error: code = NotFound desc = could not find container \"4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf\": container with ID starting with 4d1ba85fd6d78915446988f78208ec60e5e9fff0ee63ec6fe8710d363b6b6acf not found: ID does not exist" Oct 12 21:05:02 crc kubenswrapper[4773]: I1012 21:05:02.494895 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" path="/var/lib/kubelet/pods/b19adba1-d66e-4e68-a41c-b805b9a467e1/volumes" Oct 12 21:05:03 crc kubenswrapper[4773]: I1012 21:05:03.480746 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:05:03 crc kubenswrapper[4773]: E1012 21:05:03.481311 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:05:05 crc kubenswrapper[4773]: I1012 21:05:05.230541 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:05:05 crc kubenswrapper[4773]: I1012 21:05:05.285015 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:05:05 crc kubenswrapper[4773]: I1012 21:05:05.464927 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:05:06 crc kubenswrapper[4773]: I1012 21:05:06.744102 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dktnk" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" containerID="cri-o://d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca" gracePeriod=2 Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.232359 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.303702 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnv94\" (UniqueName: \"kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94\") pod \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.303830 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content\") pod \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.304898 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities\") pod \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\" (UID: \"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4\") " Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.307206 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities" (OuterVolumeSpecName: "utilities") pod "74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" (UID: 
"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.310149 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94" (OuterVolumeSpecName: "kube-api-access-bnv94") pod "74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" (UID: "74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4"). InnerVolumeSpecName "kube-api-access-bnv94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.382880 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" (UID: "74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.407149 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.407181 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnv94\" (UniqueName: \"kubernetes.io/projected/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-kube-api-access-bnv94\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.407193 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.755052 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerID="d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca" exitCode=0 Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.755102 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerDied","Data":"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca"} Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.755119 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dktnk" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.755138 4773 scope.go:117] "RemoveContainer" containerID="d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.755128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dktnk" event={"ID":"74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4","Type":"ContainerDied","Data":"cdd994838be1601258d27084da31aff089d789bc5a1c59ddc312288c41d65c29"} Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.794022 4773 scope.go:117] "RemoveContainer" containerID="03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.800734 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.809230 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dktnk"] Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.821219 4773 scope.go:117] "RemoveContainer" containerID="50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.877393 4773 scope.go:117] "RemoveContainer" 
containerID="d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca" Oct 12 21:05:07 crc kubenswrapper[4773]: E1012 21:05:07.877945 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca\": container with ID starting with d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca not found: ID does not exist" containerID="d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.877978 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca"} err="failed to get container status \"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca\": rpc error: code = NotFound desc = could not find container \"d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca\": container with ID starting with d5015ff40846547d9c793ee8be44daf74ae77f71d2c3d26f10d1469228e548ca not found: ID does not exist" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.878015 4773 scope.go:117] "RemoveContainer" containerID="03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f" Oct 12 21:05:07 crc kubenswrapper[4773]: E1012 21:05:07.878401 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f\": container with ID starting with 03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f not found: ID does not exist" containerID="03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.878440 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f"} err="failed to get container status \"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f\": rpc error: code = NotFound desc = could not find container \"03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f\": container with ID starting with 03c9a196007ab1539d4b9b0159f57bc1e24588a499ba2897762042396b69d48f not found: ID does not exist" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.878452 4773 scope.go:117] "RemoveContainer" containerID="50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96" Oct 12 21:05:07 crc kubenswrapper[4773]: E1012 21:05:07.878790 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96\": container with ID starting with 50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96 not found: ID does not exist" containerID="50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96" Oct 12 21:05:07 crc kubenswrapper[4773]: I1012 21:05:07.878833 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96"} err="failed to get container status \"50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96\": rpc error: code = NotFound desc = could not find container \"50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96\": container with ID starting with 50abb67b1d2f7938e5f48dd2f8ce05f336b23dddab90223e1a4ca6a1d0ddec96 not found: ID does not exist" Oct 12 21:05:08 crc kubenswrapper[4773]: I1012 21:05:08.491073 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" path="/var/lib/kubelet/pods/74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4/volumes" Oct 12 21:05:09 crc kubenswrapper[4773]: I1012 
21:05:09.351173 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:05:09 crc kubenswrapper[4773]: I1012 21:05:09.411586 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:05:10 crc kubenswrapper[4773]: I1012 21:05:10.865930 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:05:10 crc kubenswrapper[4773]: I1012 21:05:10.866507 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zvvpt" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" containerID="cri-o://53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef" gracePeriod=2 Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.308345 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.379115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content\") pod \"21e6825f-ed92-4bac-9093-d551fa6d17e2\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.379163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4s82\" (UniqueName: \"kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82\") pod \"21e6825f-ed92-4bac-9093-d551fa6d17e2\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.379236 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities\") pod \"21e6825f-ed92-4bac-9093-d551fa6d17e2\" (UID: \"21e6825f-ed92-4bac-9093-d551fa6d17e2\") " Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.379929 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities" (OuterVolumeSpecName: "utilities") pod "21e6825f-ed92-4bac-9093-d551fa6d17e2" (UID: "21e6825f-ed92-4bac-9093-d551fa6d17e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.380276 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.396352 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82" (OuterVolumeSpecName: "kube-api-access-l4s82") pod "21e6825f-ed92-4bac-9093-d551fa6d17e2" (UID: "21e6825f-ed92-4bac-9093-d551fa6d17e2"). InnerVolumeSpecName "kube-api-access-l4s82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.434328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21e6825f-ed92-4bac-9093-d551fa6d17e2" (UID: "21e6825f-ed92-4bac-9093-d551fa6d17e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.481398 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21e6825f-ed92-4bac-9093-d551fa6d17e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.481427 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4s82\" (UniqueName: \"kubernetes.io/projected/21e6825f-ed92-4bac-9093-d551fa6d17e2-kube-api-access-l4s82\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.793476 4773 generic.go:334] "Generic (PLEG): container finished" podID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerID="53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef" exitCode=0 Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.793520 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerDied","Data":"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef"} Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.793548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvvpt" event={"ID":"21e6825f-ed92-4bac-9093-d551fa6d17e2","Type":"ContainerDied","Data":"7182fb721e39b91b3db5f369147175b50e81c4dc0d865d8bcfa43df72f3dac48"} Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.793568 4773 scope.go:117] "RemoveContainer" containerID="53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.793637 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvvpt" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.835007 4773 scope.go:117] "RemoveContainer" containerID="efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.850762 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.869896 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zvvpt"] Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.872041 4773 scope.go:117] "RemoveContainer" containerID="b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.907204 4773 scope.go:117] "RemoveContainer" containerID="53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef" Oct 12 21:05:11 crc kubenswrapper[4773]: E1012 21:05:11.907691 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef\": container with ID starting with 53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef not found: ID does not exist" containerID="53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.907747 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef"} err="failed to get container status \"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef\": rpc error: code = NotFound desc = could not find container \"53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef\": container with ID starting with 53e73fde3c7d19802db43a28cfc8a635a2fc10fe9f0449e649f4352ce244deef not 
found: ID does not exist" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.907771 4773 scope.go:117] "RemoveContainer" containerID="efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570" Oct 12 21:05:11 crc kubenswrapper[4773]: E1012 21:05:11.908117 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570\": container with ID starting with efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570 not found: ID does not exist" containerID="efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.908148 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570"} err="failed to get container status \"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570\": rpc error: code = NotFound desc = could not find container \"efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570\": container with ID starting with efefbced16b3cdc48a56081c654cbbcac433e382a185d990951d8f012d046570 not found: ID does not exist" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.908165 4773 scope.go:117] "RemoveContainer" containerID="b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194" Oct 12 21:05:11 crc kubenswrapper[4773]: E1012 21:05:11.908472 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194\": container with ID starting with b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194 not found: ID does not exist" containerID="b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194" Oct 12 21:05:11 crc kubenswrapper[4773]: I1012 21:05:11.908499 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194"} err="failed to get container status \"b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194\": rpc error: code = NotFound desc = could not find container \"b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194\": container with ID starting with b10a01cd7792435f6e487c35d0ebdab7bc8d4c6b21eb10f8cc331667a7311194 not found: ID does not exist" Oct 12 21:05:12 crc kubenswrapper[4773]: I1012 21:05:12.501529 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" path="/var/lib/kubelet/pods/21e6825f-ed92-4bac-9093-d551fa6d17e2/volumes" Oct 12 21:05:18 crc kubenswrapper[4773]: I1012 21:05:18.481508 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:05:18 crc kubenswrapper[4773]: E1012 21:05:18.482308 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:05:20 crc kubenswrapper[4773]: I1012 21:05:20.907396 4773 generic.go:334] "Generic (PLEG): container finished" podID="23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" containerID="c464ef52d24b218aa2ecc1bb2f58957f326febd79aff0db64071e9d14c4b6e74" exitCode=0 Oct 12 21:05:20 crc kubenswrapper[4773]: I1012 21:05:20.907509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" 
event={"ID":"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4","Type":"ContainerDied","Data":"c464ef52d24b218aa2ecc1bb2f58957f326febd79aff0db64071e9d14c4b6e74"} Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.387562 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509654 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509705 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509771 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509793 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 
21:05:22.509838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509882 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509911 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509941 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.509970 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.510007 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle\") pod 
\"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.510038 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc8q8\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.510078 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.510125 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\" (UID: \"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4\") " Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.515490 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.515759 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.515968 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph" (OuterVolumeSpecName: "ceph") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.517421 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8" (OuterVolumeSpecName: "kube-api-access-rc8q8") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "kube-api-access-rc8q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.517621 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.518255 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.519805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.520258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.520647 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.521950 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.523029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.547949 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory" (OuterVolumeSpecName: "inventory") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.558852 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" (UID: "23eb0d3e-06b9-4b1e-b493-27d00d4f34f4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612325 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612350 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc8q8\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-kube-api-access-rc8q8\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612359 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612370 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612379 4773 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612388 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612399 4773 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612407 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612416 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612426 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612434 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612441 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.612448 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eb0d3e-06b9-4b1e-b493-27d00d4f34f4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.935168 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" event={"ID":"23eb0d3e-06b9-4b1e-b493-27d00d4f34f4","Type":"ContainerDied","Data":"e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763"} Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.935251 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01499f0cb1faf60a06d404c90fac11eae63c2ac6472d9b835273567a359f763" Oct 12 21:05:22 crc kubenswrapper[4773]: I1012 21:05:22.935249 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077093 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd"] Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077612 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077634 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077664 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077676 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077710 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077753 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077809 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077822 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077845 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077856 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077875 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077887 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077925 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077936 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="extract-content" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077949 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.077963 4773 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.077991 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078003 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: E1012 21:05:23.078020 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078035 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="extract-utilities" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078346 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19adba1-d66e-4e68-a41c-b805b9a467e1" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078365 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e6825f-ed92-4bac-9093-d551fa6d17e2" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078391 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="23eb0d3e-06b9-4b1e-b493-27d00d4f34f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.078419 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e4ca00-1bf5-42e2-b14a-2dc8328f3dc4" containerName="registry-server" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.079352 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.091262 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.092013 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.092361 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.097431 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.098312 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd"] Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.098552 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.224976 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.225056 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hv8r\" (UniqueName: \"kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.225168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.225524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.327900 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hv8r\" (UniqueName: \"kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.328020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.328156 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.328188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.332855 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.334840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.336286 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.356165 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hv8r\" (UniqueName: \"kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.401139 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:23 crc kubenswrapper[4773]: I1012 21:05:23.948196 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd"] Oct 12 21:05:24 crc kubenswrapper[4773]: I1012 21:05:24.954614 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" event={"ID":"f7d6457c-5706-4a38-b0ef-24cc906b7cab","Type":"ContainerStarted","Data":"0fc2c8e7bdef26a24a1330d2c38ba975296ad8ae03a9a07d09174068064cfb50"} Oct 12 21:05:24 crc kubenswrapper[4773]: I1012 21:05:24.956141 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" event={"ID":"f7d6457c-5706-4a38-b0ef-24cc906b7cab","Type":"ContainerStarted","Data":"75237bb7ef291952a58929d4ae21b3f0515919577068782b86e47006c95bb6a9"} Oct 12 21:05:24 crc kubenswrapper[4773]: I1012 21:05:24.978497 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" podStartSLOduration=1.460448397 podStartE2EDuration="1.978480439s" podCreationTimestamp="2025-10-12 21:05:23 +0000 UTC" firstStartedPulling="2025-10-12 21:05:23.967859044 +0000 UTC m=+2472.204157624" lastFinishedPulling="2025-10-12 21:05:24.485891106 +0000 UTC m=+2472.722189666" observedRunningTime="2025-10-12 21:05:24.974275426 +0000 
UTC m=+2473.210573996" watchObservedRunningTime="2025-10-12 21:05:24.978480439 +0000 UTC m=+2473.214779009" Oct 12 21:05:30 crc kubenswrapper[4773]: I1012 21:05:30.481855 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:05:30 crc kubenswrapper[4773]: E1012 21:05:30.483848 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:05:31 crc kubenswrapper[4773]: I1012 21:05:31.023131 4773 generic.go:334] "Generic (PLEG): container finished" podID="f7d6457c-5706-4a38-b0ef-24cc906b7cab" containerID="0fc2c8e7bdef26a24a1330d2c38ba975296ad8ae03a9a07d09174068064cfb50" exitCode=0 Oct 12 21:05:31 crc kubenswrapper[4773]: I1012 21:05:31.023178 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" event={"ID":"f7d6457c-5706-4a38-b0ef-24cc906b7cab","Type":"ContainerDied","Data":"0fc2c8e7bdef26a24a1330d2c38ba975296ad8ae03a9a07d09174068064cfb50"} Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.493780 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.612530 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory\") pod \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.612591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key\") pod \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.612689 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hv8r\" (UniqueName: \"kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r\") pod \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.612724 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph\") pod \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\" (UID: \"f7d6457c-5706-4a38-b0ef-24cc906b7cab\") " Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.619588 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r" (OuterVolumeSpecName: "kube-api-access-5hv8r") pod "f7d6457c-5706-4a38-b0ef-24cc906b7cab" (UID: "f7d6457c-5706-4a38-b0ef-24cc906b7cab"). InnerVolumeSpecName "kube-api-access-5hv8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.619732 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph" (OuterVolumeSpecName: "ceph") pod "f7d6457c-5706-4a38-b0ef-24cc906b7cab" (UID: "f7d6457c-5706-4a38-b0ef-24cc906b7cab"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.640190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory" (OuterVolumeSpecName: "inventory") pod "f7d6457c-5706-4a38-b0ef-24cc906b7cab" (UID: "f7d6457c-5706-4a38-b0ef-24cc906b7cab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.644640 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f7d6457c-5706-4a38-b0ef-24cc906b7cab" (UID: "f7d6457c-5706-4a38-b0ef-24cc906b7cab"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.715475 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.715503 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.715512 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hv8r\" (UniqueName: \"kubernetes.io/projected/f7d6457c-5706-4a38-b0ef-24cc906b7cab-kube-api-access-5hv8r\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:32 crc kubenswrapper[4773]: I1012 21:05:32.715521 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f7d6457c-5706-4a38-b0ef-24cc906b7cab-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.041318 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" event={"ID":"f7d6457c-5706-4a38-b0ef-24cc906b7cab","Type":"ContainerDied","Data":"75237bb7ef291952a58929d4ae21b3f0515919577068782b86e47006c95bb6a9"} Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.041354 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75237bb7ef291952a58929d4ae21b3f0515919577068782b86e47006c95bb6a9" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.041427 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.125336 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd"] Oct 12 21:05:33 crc kubenswrapper[4773]: E1012 21:05:33.125654 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d6457c-5706-4a38-b0ef-24cc906b7cab" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.125671 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d6457c-5706-4a38-b0ef-24cc906b7cab" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.125850 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d6457c-5706-4a38-b0ef-24cc906b7cab" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.126413 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.131556 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.131868 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.132163 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.132261 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.132375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.132586 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.144309 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd"] Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225074 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2clw4\" (UniqueName: \"kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225357 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225413 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.225587 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 
crc kubenswrapper[4773]: I1012 21:05:33.328153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2clw4\" (UniqueName: \"kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.328224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.328282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.328330 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.329242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.329593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.330046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.333083 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.334344 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.336552 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.346336 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.346591 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2clw4\" (UniqueName: \"kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dlkwd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.441817 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:05:33 crc kubenswrapper[4773]: I1012 21:05:33.992022 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd"] Oct 12 21:05:34 crc kubenswrapper[4773]: I1012 21:05:34.053357 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" event={"ID":"62f86eec-8f45-4449-a363-cb195f58abbd","Type":"ContainerStarted","Data":"1c0a9a461edb8f9154543bc69a62f3e2a4a9b09d915b3b232e2c0b791d3cc887"} Oct 12 21:05:35 crc kubenswrapper[4773]: I1012 21:05:35.070444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" event={"ID":"62f86eec-8f45-4449-a363-cb195f58abbd","Type":"ContainerStarted","Data":"2363b98c3dfbbe871d6920d99a589fae27fd57928f0d1db054c5a181a1d801be"} Oct 12 21:05:35 crc kubenswrapper[4773]: I1012 21:05:35.102180 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" podStartSLOduration=1.6779983779999998 podStartE2EDuration="2.102156654s" podCreationTimestamp="2025-10-12 21:05:33 +0000 UTC" firstStartedPulling="2025-10-12 21:05:34.014173969 +0000 UTC m=+2482.250472529" lastFinishedPulling="2025-10-12 21:05:34.438332235 +0000 UTC m=+2482.674630805" observedRunningTime="2025-10-12 21:05:35.100351556 +0000 UTC m=+2483.336650156" watchObservedRunningTime="2025-10-12 21:05:35.102156654 +0000 UTC m=+2483.338455264" Oct 12 21:05:41 crc kubenswrapper[4773]: I1012 21:05:41.480895 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:05:41 crc kubenswrapper[4773]: E1012 21:05:41.481796 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:05:52 crc kubenswrapper[4773]: I1012 21:05:52.492433 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:05:52 crc kubenswrapper[4773]: E1012 21:05:52.494006 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:06:04 crc kubenswrapper[4773]: I1012 21:06:04.504521 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:06:04 crc kubenswrapper[4773]: E1012 21:06:04.505489 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:06:17 crc kubenswrapper[4773]: I1012 21:06:17.481884 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:06:17 crc kubenswrapper[4773]: E1012 21:06:17.483173 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:06:28 crc kubenswrapper[4773]: I1012 21:06:28.481481 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:06:28 crc kubenswrapper[4773]: E1012 21:06:28.482140 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:06:42 crc kubenswrapper[4773]: I1012 21:06:42.486444 4773 scope.go:117] "RemoveContainer" containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:06:42 crc kubenswrapper[4773]: I1012 21:06:42.727742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e"} Oct 12 21:07:01 crc kubenswrapper[4773]: I1012 21:07:01.927002 4773 generic.go:334] "Generic (PLEG): container finished" podID="62f86eec-8f45-4449-a363-cb195f58abbd" containerID="2363b98c3dfbbe871d6920d99a589fae27fd57928f0d1db054c5a181a1d801be" exitCode=0 Oct 12 21:07:01 crc kubenswrapper[4773]: I1012 21:07:01.927096 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" 
event={"ID":"62f86eec-8f45-4449-a363-cb195f58abbd","Type":"ContainerDied","Data":"2363b98c3dfbbe871d6920d99a589fae27fd57928f0d1db054c5a181a1d801be"} Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.440269 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.503489 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.531925 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.604923 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.604964 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.604990 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.605028 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.605086 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2clw4\" (UniqueName: \"kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4\") pod \"62f86eec-8f45-4449-a363-cb195f58abbd\" (UID: \"62f86eec-8f45-4449-a363-cb195f58abbd\") " Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.605354 4773 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/62f86eec-8f45-4449-a363-cb195f58abbd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.607627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4" (OuterVolumeSpecName: "kube-api-access-2clw4") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "kube-api-access-2clw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.611611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.612908 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph" (OuterVolumeSpecName: "ceph") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.630704 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.641774 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory" (OuterVolumeSpecName: "inventory") pod "62f86eec-8f45-4449-a363-cb195f58abbd" (UID: "62f86eec-8f45-4449-a363-cb195f58abbd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.707304 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.707350 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.707364 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.707376 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62f86eec-8f45-4449-a363-cb195f58abbd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.707387 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2clw4\" (UniqueName: \"kubernetes.io/projected/62f86eec-8f45-4449-a363-cb195f58abbd-kube-api-access-2clw4\") on node \"crc\" DevicePath \"\"" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.949402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" 
event={"ID":"62f86eec-8f45-4449-a363-cb195f58abbd","Type":"ContainerDied","Data":"1c0a9a461edb8f9154543bc69a62f3e2a4a9b09d915b3b232e2c0b791d3cc887"} Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.949443 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0a9a461edb8f9154543bc69a62f3e2a4a9b09d915b3b232e2c0b791d3cc887" Oct 12 21:07:03 crc kubenswrapper[4773]: I1012 21:07:03.949450 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dlkwd" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.051638 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb"] Oct 12 21:07:04 crc kubenswrapper[4773]: E1012 21:07:04.052247 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f86eec-8f45-4449-a363-cb195f58abbd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.052269 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f86eec-8f45-4449-a363-cb195f58abbd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.052466 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f86eec-8f45-4449-a363-cb195f58abbd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.053144 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.054955 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.055119 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.057917 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.058027 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.058134 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.058211 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.064691 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.072857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb"] Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115551 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 
crc kubenswrapper[4773]: I1012 21:07:04.115614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qm5p\" (UniqueName: \"kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115740 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115759 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.115798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217347 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qm5p\" (UniqueName: \"kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217521 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.217668 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.222418 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.222669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.223107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.224417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.224625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.226129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.234365 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qm5p\" (UniqueName: \"kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.368106 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:07:04 crc kubenswrapper[4773]: I1012 21:07:04.955235 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb"] Oct 12 21:07:05 crc kubenswrapper[4773]: I1012 21:07:05.967685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" event={"ID":"0f130afc-51e5-494f-b915-7ec573c760b1","Type":"ContainerStarted","Data":"5233cdbf068f6b1b78e63978031a62d735a95f0f7b5ef07293c8948c695dd8d3"} Oct 12 21:07:05 crc kubenswrapper[4773]: I1012 21:07:05.967762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" event={"ID":"0f130afc-51e5-494f-b915-7ec573c760b1","Type":"ContainerStarted","Data":"602d8c4652c6f98a23dea658dda3e69b1d713af6d86902f610d84875db326b44"} Oct 12 21:07:05 crc kubenswrapper[4773]: I1012 21:07:05.991200 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" podStartSLOduration=1.337018214 podStartE2EDuration="1.991178028s" podCreationTimestamp="2025-10-12 21:07:04 +0000 UTC" firstStartedPulling="2025-10-12 21:07:04.972797437 +0000 UTC m=+2573.209095997" lastFinishedPulling="2025-10-12 21:07:05.626957211 +0000 UTC m=+2573.863255811" observedRunningTime="2025-10-12 21:07:05.98749846 +0000 UTC m=+2574.223797020" watchObservedRunningTime="2025-10-12 21:07:05.991178028 +0000 UTC m=+2574.227476608" Oct 12 21:08:17 crc kubenswrapper[4773]: I1012 21:08:17.655600 4773 generic.go:334] "Generic (PLEG): container finished" podID="0f130afc-51e5-494f-b915-7ec573c760b1" containerID="5233cdbf068f6b1b78e63978031a62d735a95f0f7b5ef07293c8948c695dd8d3" exitCode=0 Oct 12 21:08:17 crc kubenswrapper[4773]: I1012 21:08:17.655838 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" event={"ID":"0f130afc-51e5-494f-b915-7ec573c760b1","Type":"ContainerDied","Data":"5233cdbf068f6b1b78e63978031a62d735a95f0f7b5ef07293c8948c695dd8d3"} Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.075249 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.170263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qm5p\" (UniqueName: \"kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.170825 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.170962 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.171101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.171241 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.171371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.171678 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0f130afc-51e5-494f-b915-7ec573c760b1\" (UID: \"0f130afc-51e5-494f-b915-7ec573c760b1\") " Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.176011 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph" (OuterVolumeSpecName: "ceph") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.190567 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.190867 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p" (OuterVolumeSpecName: "kube-api-access-2qm5p") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "kube-api-access-2qm5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.194021 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.201200 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.206826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory" (OuterVolumeSpecName: "inventory") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.218945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f130afc-51e5-494f-b915-7ec573c760b1" (UID: "0f130afc-51e5-494f-b915-7ec573c760b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274457 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274490 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qm5p\" (UniqueName: \"kubernetes.io/projected/0f130afc-51e5-494f-b915-7ec573c760b1-kube-api-access-2qm5p\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274504 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274512 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274522 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274530 4773 reconciler_common.go:293] "Volume detached 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.274539 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0f130afc-51e5-494f-b915-7ec573c760b1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.676002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" event={"ID":"0f130afc-51e5-494f-b915-7ec573c760b1","Type":"ContainerDied","Data":"602d8c4652c6f98a23dea658dda3e69b1d713af6d86902f610d84875db326b44"} Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.676044 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602d8c4652c6f98a23dea658dda3e69b1d713af6d86902f610d84875db326b44" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.676055 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.804162 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v"] Oct 12 21:08:19 crc kubenswrapper[4773]: E1012 21:08:19.804768 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f130afc-51e5-494f-b915-7ec573c760b1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.804793 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f130afc-51e5-494f-b915-7ec573c760b1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.805033 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f130afc-51e5-494f-b915-7ec573c760b1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.805893 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.814865 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.815073 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.814877 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.815948 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.816575 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.818675 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.828468 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v"] Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985124 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985194 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985354 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:19 crc kubenswrapper[4773]: I1012 21:08:19.985381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vbf\" (UniqueName: \"kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086790 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4vbf\" (UniqueName: \"kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.086927 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.090997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.096128 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.096356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.096640 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.101906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.108162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4vbf\" (UniqueName: \"kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.143537 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.664100 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v"] Oct 12 21:08:20 crc kubenswrapper[4773]: I1012 21:08:20.683937 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" event={"ID":"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c","Type":"ContainerStarted","Data":"35f32213f763f938c7b16ed7cc0310e17121dc1e095da51c6bf0a48b9ae2f93e"} Oct 12 21:08:21 crc kubenswrapper[4773]: I1012 21:08:21.710084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" event={"ID":"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c","Type":"ContainerStarted","Data":"d0de99ccc5467ac6c229a909a2c1e127a8e09da98c0311277f4e84fd5c673d9e"} Oct 12 21:08:21 crc kubenswrapper[4773]: I1012 21:08:21.735029 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" podStartSLOduration=2.197166982 podStartE2EDuration="2.735009032s" podCreationTimestamp="2025-10-12 21:08:19 +0000 UTC" firstStartedPulling="2025-10-12 21:08:20.674569446 +0000 UTC m=+2648.910868006" lastFinishedPulling="2025-10-12 21:08:21.212411496 +0000 UTC m=+2649.448710056" observedRunningTime="2025-10-12 21:08:21.733936224 +0000 UTC m=+2649.970234784" watchObservedRunningTime="2025-10-12 21:08:21.735009032 +0000 UTC m=+2649.971307602" Oct 12 21:08:58 crc kubenswrapper[4773]: I1012 21:08:58.670017 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:08:58 crc kubenswrapper[4773]: I1012 
21:08:58.670868 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:09:28 crc kubenswrapper[4773]: I1012 21:09:28.669241 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:09:28 crc kubenswrapper[4773]: I1012 21:09:28.670192 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:09:58 crc kubenswrapper[4773]: I1012 21:09:58.669626 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:09:58 crc kubenswrapper[4773]: I1012 21:09:58.671001 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:09:58 crc kubenswrapper[4773]: I1012 21:09:58.671116 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:09:58 crc kubenswrapper[4773]: I1012 21:09:58.671860 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:09:58 crc kubenswrapper[4773]: I1012 21:09:58.672017 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e" gracePeriod=600 Oct 12 21:09:59 crc kubenswrapper[4773]: I1012 21:09:59.665967 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e" exitCode=0 Oct 12 21:09:59 crc kubenswrapper[4773]: I1012 21:09:59.666039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e"} Oct 12 21:09:59 crc kubenswrapper[4773]: I1012 21:09:59.666564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e"} Oct 12 21:09:59 crc kubenswrapper[4773]: I1012 21:09:59.666591 4773 scope.go:117] "RemoveContainer" 
containerID="9be54bdb9c93d704c619104d92214dd880330f21440ab2b814935d5ced48dce8" Oct 12 21:10:11 crc kubenswrapper[4773]: I1012 21:10:11.998801 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.001107 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.054953 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.189995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.190044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7mmn\" (UniqueName: \"kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.190066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.292278 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.292335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7mmn\" (UniqueName: \"kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.292361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.292812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.292878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.310355 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7mmn\" (UniqueName: 
\"kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn\") pod \"redhat-marketplace-qlkts\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.318295 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:12 crc kubenswrapper[4773]: I1012 21:10:12.836151 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:13 crc kubenswrapper[4773]: I1012 21:10:13.806340 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerID="62c93a46c60d117d46733cc20979c5409f8613d1841efbc0e0cb57aa66256414" exitCode=0 Oct 12 21:10:13 crc kubenswrapper[4773]: I1012 21:10:13.806405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerDied","Data":"62c93a46c60d117d46733cc20979c5409f8613d1841efbc0e0cb57aa66256414"} Oct 12 21:10:13 crc kubenswrapper[4773]: I1012 21:10:13.806880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerStarted","Data":"8aa82e4cd6b49fe20c4bfee88120f783b8b5b98d348013e6ae6708a2a72d442c"} Oct 12 21:10:13 crc kubenswrapper[4773]: I1012 21:10:13.808584 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:10:14 crc kubenswrapper[4773]: I1012 21:10:14.821970 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerStarted","Data":"90a93e2fe4ae5e51c255011b499630c49c147f661b3f39e195690810aeb45815"} Oct 12 21:10:15 crc 
kubenswrapper[4773]: I1012 21:10:15.837351 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerID="90a93e2fe4ae5e51c255011b499630c49c147f661b3f39e195690810aeb45815" exitCode=0 Oct 12 21:10:15 crc kubenswrapper[4773]: I1012 21:10:15.837608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerDied","Data":"90a93e2fe4ae5e51c255011b499630c49c147f661b3f39e195690810aeb45815"} Oct 12 21:10:16 crc kubenswrapper[4773]: I1012 21:10:16.848298 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerStarted","Data":"e8e286bac58eefb49ada4c514b5b754e2db596c3cec766ddbf6f8bd144f02c8f"} Oct 12 21:10:16 crc kubenswrapper[4773]: I1012 21:10:16.883263 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qlkts" podStartSLOduration=3.449237174 podStartE2EDuration="5.88324452s" podCreationTimestamp="2025-10-12 21:10:11 +0000 UTC" firstStartedPulling="2025-10-12 21:10:13.808378661 +0000 UTC m=+2762.044677211" lastFinishedPulling="2025-10-12 21:10:16.242385987 +0000 UTC m=+2764.478684557" observedRunningTime="2025-10-12 21:10:16.869338145 +0000 UTC m=+2765.105636755" watchObservedRunningTime="2025-10-12 21:10:16.88324452 +0000 UTC m=+2765.119543080" Oct 12 21:10:22 crc kubenswrapper[4773]: I1012 21:10:22.318545 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:22 crc kubenswrapper[4773]: I1012 21:10:22.320033 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:22 crc kubenswrapper[4773]: I1012 21:10:22.367586 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:22 crc kubenswrapper[4773]: I1012 21:10:22.989771 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:23 crc kubenswrapper[4773]: I1012 21:10:23.038191 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:24 crc kubenswrapper[4773]: I1012 21:10:24.932917 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qlkts" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="registry-server" containerID="cri-o://e8e286bac58eefb49ada4c514b5b754e2db596c3cec766ddbf6f8bd144f02c8f" gracePeriod=2 Oct 12 21:10:25 crc kubenswrapper[4773]: I1012 21:10:25.943775 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerID="e8e286bac58eefb49ada4c514b5b754e2db596c3cec766ddbf6f8bd144f02c8f" exitCode=0 Oct 12 21:10:25 crc kubenswrapper[4773]: I1012 21:10:25.943827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerDied","Data":"e8e286bac58eefb49ada4c514b5b754e2db596c3cec766ddbf6f8bd144f02c8f"} Oct 12 21:10:25 crc kubenswrapper[4773]: I1012 21:10:25.943865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qlkts" event={"ID":"ae400c58-63a4-4c5b-9b35-09a8cea189f1","Type":"ContainerDied","Data":"8aa82e4cd6b49fe20c4bfee88120f783b8b5b98d348013e6ae6708a2a72d442c"} Oct 12 21:10:25 crc kubenswrapper[4773]: I1012 21:10:25.943893 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa82e4cd6b49fe20c4bfee88120f783b8b5b98d348013e6ae6708a2a72d442c" Oct 12 21:10:25 crc kubenswrapper[4773]: I1012 21:10:25.968529 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.164355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities\") pod \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.164539 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7mmn\" (UniqueName: \"kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn\") pod \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.164603 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content\") pod \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\" (UID: \"ae400c58-63a4-4c5b-9b35-09a8cea189f1\") " Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.166054 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities" (OuterVolumeSpecName: "utilities") pod "ae400c58-63a4-4c5b-9b35-09a8cea189f1" (UID: "ae400c58-63a4-4c5b-9b35-09a8cea189f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.178093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn" (OuterVolumeSpecName: "kube-api-access-l7mmn") pod "ae400c58-63a4-4c5b-9b35-09a8cea189f1" (UID: "ae400c58-63a4-4c5b-9b35-09a8cea189f1"). 
InnerVolumeSpecName "kube-api-access-l7mmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.179778 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae400c58-63a4-4c5b-9b35-09a8cea189f1" (UID: "ae400c58-63a4-4c5b-9b35-09a8cea189f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.266529 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7mmn\" (UniqueName: \"kubernetes.io/projected/ae400c58-63a4-4c5b-9b35-09a8cea189f1-kube-api-access-l7mmn\") on node \"crc\" DevicePath \"\"" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.266558 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.266567 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae400c58-63a4-4c5b-9b35-09a8cea189f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.951593 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qlkts" Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.974775 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:26 crc kubenswrapper[4773]: I1012 21:10:26.984510 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qlkts"] Oct 12 21:10:28 crc kubenswrapper[4773]: I1012 21:10:28.491534 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" path="/var/lib/kubelet/pods/ae400c58-63a4-4c5b-9b35-09a8cea189f1/volumes" Oct 12 21:11:58 crc kubenswrapper[4773]: I1012 21:11:58.669247 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:11:58 crc kubenswrapper[4773]: I1012 21:11:58.670013 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:12:28 crc kubenswrapper[4773]: I1012 21:12:28.670041 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:12:28 crc kubenswrapper[4773]: I1012 21:12:28.670894 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:12:58 crc kubenswrapper[4773]: I1012 21:12:58.669779 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:12:58 crc kubenswrapper[4773]: I1012 21:12:58.670267 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:12:58 crc kubenswrapper[4773]: I1012 21:12:58.670318 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:12:58 crc kubenswrapper[4773]: I1012 21:12:58.671119 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:12:58 crc kubenswrapper[4773]: I1012 21:12:58.671173 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" gracePeriod=600 Oct 12 
21:12:58 crc kubenswrapper[4773]: E1012 21:12:58.798030 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:12:59 crc kubenswrapper[4773]: I1012 21:12:59.437652 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" exitCode=0 Oct 12 21:12:59 crc kubenswrapper[4773]: I1012 21:12:59.437743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e"} Oct 12 21:12:59 crc kubenswrapper[4773]: I1012 21:12:59.438031 4773 scope.go:117] "RemoveContainer" containerID="6e9d2a2e10726cb6bc3146bbd48dd0b1abfe6c4848fd3baeeeb35db6671eae8e" Oct 12 21:12:59 crc kubenswrapper[4773]: I1012 21:12:59.438370 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:12:59 crc kubenswrapper[4773]: E1012 21:12:59.438625 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:13:14 crc kubenswrapper[4773]: I1012 21:13:14.481449 4773 
scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:13:14 crc kubenswrapper[4773]: E1012 21:13:14.482040 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:13:29 crc kubenswrapper[4773]: I1012 21:13:29.482218 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:13:29 crc kubenswrapper[4773]: E1012 21:13:29.484687 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:13:43 crc kubenswrapper[4773]: I1012 21:13:43.481845 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:13:43 crc kubenswrapper[4773]: E1012 21:13:43.482904 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:13:54 crc kubenswrapper[4773]: I1012 
21:13:54.483502 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:13:54 crc kubenswrapper[4773]: E1012 21:13:54.486038 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:14:01 crc kubenswrapper[4773]: I1012 21:14:01.050538 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" containerID="d0de99ccc5467ac6c229a909a2c1e127a8e09da98c0311277f4e84fd5c673d9e" exitCode=0 Oct 12 21:14:01 crc kubenswrapper[4773]: I1012 21:14:01.050616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" event={"ID":"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c","Type":"ContainerDied","Data":"d0de99ccc5467ac6c229a909a2c1e127a8e09da98c0311277f4e84fd5c673d9e"} Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.539607 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.593805 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.593866 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.593906 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.594041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.594079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.594121 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-g4vbf\" (UniqueName: \"kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf\") pod \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\" (UID: \"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c\") " Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.606257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf" (OuterVolumeSpecName: "kube-api-access-g4vbf") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "kube-api-access-g4vbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.610068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.634280 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph" (OuterVolumeSpecName: "ceph") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.659660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.660494 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.660881 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory" (OuterVolumeSpecName: "inventory") pod "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" (UID: "e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696451 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696490 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696506 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696519 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696532 4773 
reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:02 crc kubenswrapper[4773]: I1012 21:14:02.696545 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4vbf\" (UniqueName: \"kubernetes.io/projected/e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c-kube-api-access-g4vbf\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.074149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" event={"ID":"e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c","Type":"ContainerDied","Data":"35f32213f763f938c7b16ed7cc0310e17121dc1e095da51c6bf0a48b9ae2f93e"} Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.074227 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35f32213f763f938c7b16ed7cc0310e17121dc1e095da51c6bf0a48b9ae2f93e" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.074332 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.256918 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq"] Oct 12 21:14:03 crc kubenswrapper[4773]: E1012 21:14:03.257233 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257250 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 21:14:03 crc kubenswrapper[4773]: E1012 21:14:03.257274 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="registry-server" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257280 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="registry-server" Oct 12 21:14:03 crc kubenswrapper[4773]: E1012 21:14:03.257295 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="extract-utilities" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257300 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="extract-utilities" Oct 12 21:14:03 crc kubenswrapper[4773]: E1012 21:14:03.257309 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="extract-content" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257315 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="extract-content" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257469 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.257490 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae400c58-63a4-4c5b-9b35-09a8cea189f1" containerName="registry-server" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.258071 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.260489 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9nkqm" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.260833 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.261422 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.261700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.262049 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.262369 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.262843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.263191 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.263875 
4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.278551 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq"] Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307310 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307669 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307693 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307823 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: 
\"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.307882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tgrg\" (UniqueName: \"kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.308055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.308102 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.409607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" 
Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.409662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.409689 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.409737 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.410768 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tgrg\" (UniqueName: \"kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.411191 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.411222 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.411947 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.412552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.412584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.412687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.412733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.413377 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.413951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc 
kubenswrapper[4773]: I1012 21:14:03.414269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.416858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.420644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.421505 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.421757 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.421796 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.423157 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.457165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tgrg\" (UniqueName: \"kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:03 crc kubenswrapper[4773]: I1012 21:14:03.581063 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:14:04 crc kubenswrapper[4773]: I1012 21:14:04.100032 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq"] Oct 12 21:14:05 crc kubenswrapper[4773]: I1012 21:14:05.095797 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" event={"ID":"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac","Type":"ContainerStarted","Data":"6e8b61c72edb04c057b0cb2f535d0409ecd301f5b1272e7b78126fc97a5ce9b2"} Oct 12 21:14:05 crc kubenswrapper[4773]: I1012 21:14:05.096155 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" event={"ID":"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac","Type":"ContainerStarted","Data":"550bbae69a55d7a19c187ca43370024cf64b700718e52b3fc81db89cb35c2e0d"} Oct 12 21:14:05 crc kubenswrapper[4773]: I1012 21:14:05.114861 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" podStartSLOduration=1.6331620980000001 podStartE2EDuration="2.114840245s" podCreationTimestamp="2025-10-12 21:14:03 +0000 UTC" firstStartedPulling="2025-10-12 21:14:04.112163592 +0000 UTC m=+2992.348462152" lastFinishedPulling="2025-10-12 21:14:04.593841709 +0000 UTC m=+2992.830140299" observedRunningTime="2025-10-12 21:14:05.113233651 +0000 UTC m=+2993.349532211" watchObservedRunningTime="2025-10-12 21:14:05.114840245 +0000 UTC m=+2993.351138795" Oct 12 21:14:08 crc kubenswrapper[4773]: I1012 21:14:08.482158 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:14:08 crc kubenswrapper[4773]: E1012 21:14:08.483301 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:14:23 crc kubenswrapper[4773]: I1012 21:14:23.481384 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:14:23 crc kubenswrapper[4773]: E1012 21:14:23.482220 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.563452 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.568350 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.587406 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.681338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.681486 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.681522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7n7n\" (UniqueName: \"kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.783543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.783611 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r7n7n\" (UniqueName: \"kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.783704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.784323 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.784661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.802139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7n7n\" (UniqueName: \"kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n\") pod \"redhat-operators-xq4s6\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:30 crc kubenswrapper[4773]: I1012 21:14:30.891266 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:31 crc kubenswrapper[4773]: I1012 21:14:31.359850 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:32 crc kubenswrapper[4773]: I1012 21:14:32.351077 4773 generic.go:334] "Generic (PLEG): container finished" podID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerID="aa1e086830698327baacb3bcc8ba6cc870bad45db1e358f875b1e9cf8557983e" exitCode=0 Oct 12 21:14:32 crc kubenswrapper[4773]: I1012 21:14:32.351132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerDied","Data":"aa1e086830698327baacb3bcc8ba6cc870bad45db1e358f875b1e9cf8557983e"} Oct 12 21:14:32 crc kubenswrapper[4773]: I1012 21:14:32.351383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerStarted","Data":"ea1fdd609b28fddfc41584e7fc8303b91acd2a66882543801e1381904c1f06ec"} Oct 12 21:14:34 crc kubenswrapper[4773]: I1012 21:14:34.370683 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerStarted","Data":"d4e762b832d3199b3de4c1783405eff4b7589cd7cedf16bfb4917df1381c8e26"} Oct 12 21:14:35 crc kubenswrapper[4773]: I1012 21:14:35.480892 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:14:35 crc kubenswrapper[4773]: E1012 21:14:35.481446 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:14:36 crc kubenswrapper[4773]: I1012 21:14:36.394131 4773 generic.go:334] "Generic (PLEG): container finished" podID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerID="d4e762b832d3199b3de4c1783405eff4b7589cd7cedf16bfb4917df1381c8e26" exitCode=0 Oct 12 21:14:36 crc kubenswrapper[4773]: I1012 21:14:36.394186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerDied","Data":"d4e762b832d3199b3de4c1783405eff4b7589cd7cedf16bfb4917df1381c8e26"} Oct 12 21:14:37 crc kubenswrapper[4773]: I1012 21:14:37.405830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerStarted","Data":"d467c418e06b3883ff7c8ba1b9ddda046929ae0dc74b9c61a5ce726e5c08dcc9"} Oct 12 21:14:37 crc kubenswrapper[4773]: I1012 21:14:37.431049 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xq4s6" podStartSLOduration=2.974506115 podStartE2EDuration="7.4310282s" podCreationTimestamp="2025-10-12 21:14:30 +0000 UTC" firstStartedPulling="2025-10-12 21:14:32.352589332 +0000 UTC m=+3020.588887892" lastFinishedPulling="2025-10-12 21:14:36.809111417 +0000 UTC m=+3025.045409977" observedRunningTime="2025-10-12 21:14:37.421151176 +0000 UTC m=+3025.657449756" watchObservedRunningTime="2025-10-12 21:14:37.4310282 +0000 UTC m=+3025.667326770" Oct 12 21:14:40 crc kubenswrapper[4773]: I1012 21:14:40.891891 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:40 crc kubenswrapper[4773]: I1012 21:14:40.892234 
4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:41 crc kubenswrapper[4773]: I1012 21:14:41.940085 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xq4s6" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="registry-server" probeResult="failure" output=< Oct 12 21:14:41 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:14:41 crc kubenswrapper[4773]: > Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.003907 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.006617 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.018099 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.141637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.141693 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmbl\" (UniqueName: \"kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.141942 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.243376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.243519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.243548 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmbl\" (UniqueName: \"kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.243930 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.244176 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.269597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmbl\" (UniqueName: \"kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl\") pod \"community-operators-snffg\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.330438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.481038 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:14:49 crc kubenswrapper[4773]: E1012 21:14:49.481282 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:14:49 crc kubenswrapper[4773]: I1012 21:14:49.916492 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:14:50 crc kubenswrapper[4773]: I1012 21:14:50.526249 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerID="f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c" exitCode=0 Oct 12 21:14:50 crc 
kubenswrapper[4773]: I1012 21:14:50.526533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerDied","Data":"f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c"} Oct 12 21:14:50 crc kubenswrapper[4773]: I1012 21:14:50.526563 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerStarted","Data":"d0c04b6b7eb7e6abdb82b48c15a59f1fc3d31b93c6a24734573fde4505784c8d"} Oct 12 21:14:50 crc kubenswrapper[4773]: I1012 21:14:50.951525 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:50 crc kubenswrapper[4773]: I1012 21:14:50.999260 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:52 crc kubenswrapper[4773]: I1012 21:14:52.546961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerStarted","Data":"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70"} Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.187849 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.188366 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xq4s6" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="registry-server" containerID="cri-o://d467c418e06b3883ff7c8ba1b9ddda046929ae0dc74b9c61a5ce726e5c08dcc9" gracePeriod=2 Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.556976 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerID="d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70" exitCode=0 Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.557043 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerDied","Data":"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70"} Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.561416 4773 generic.go:334] "Generic (PLEG): container finished" podID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerID="d467c418e06b3883ff7c8ba1b9ddda046929ae0dc74b9c61a5ce726e5c08dcc9" exitCode=0 Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.561454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerDied","Data":"d467c418e06b3883ff7c8ba1b9ddda046929ae0dc74b9c61a5ce726e5c08dcc9"} Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.561480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq4s6" event={"ID":"bf6aee0a-905f-418a-ace9-c09497895ea2","Type":"ContainerDied","Data":"ea1fdd609b28fddfc41584e7fc8303b91acd2a66882543801e1381904c1f06ec"} Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.561490 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea1fdd609b28fddfc41584e7fc8303b91acd2a66882543801e1381904c1f06ec" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.630912 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.734591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities\") pod \"bf6aee0a-905f-418a-ace9-c09497895ea2\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.734631 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content\") pod \"bf6aee0a-905f-418a-ace9-c09497895ea2\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.734775 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7n7n\" (UniqueName: \"kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n\") pod \"bf6aee0a-905f-418a-ace9-c09497895ea2\" (UID: \"bf6aee0a-905f-418a-ace9-c09497895ea2\") " Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.736343 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities" (OuterVolumeSpecName: "utilities") pod "bf6aee0a-905f-418a-ace9-c09497895ea2" (UID: "bf6aee0a-905f-418a-ace9-c09497895ea2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.742743 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n" (OuterVolumeSpecName: "kube-api-access-r7n7n") pod "bf6aee0a-905f-418a-ace9-c09497895ea2" (UID: "bf6aee0a-905f-418a-ace9-c09497895ea2"). InnerVolumeSpecName "kube-api-access-r7n7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.816189 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf6aee0a-905f-418a-ace9-c09497895ea2" (UID: "bf6aee0a-905f-418a-ace9-c09497895ea2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.836761 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7n7n\" (UniqueName: \"kubernetes.io/projected/bf6aee0a-905f-418a-ace9-c09497895ea2-kube-api-access-r7n7n\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.836808 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:53 crc kubenswrapper[4773]: I1012 21:14:53.836821 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6aee0a-905f-418a-ace9-c09497895ea2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:14:54 crc kubenswrapper[4773]: I1012 21:14:54.576661 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerStarted","Data":"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc"} Oct 12 21:14:54 crc kubenswrapper[4773]: I1012 21:14:54.576684 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq4s6" Oct 12 21:14:54 crc kubenswrapper[4773]: I1012 21:14:54.602107 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-snffg" podStartSLOduration=3.098767366 podStartE2EDuration="6.602091829s" podCreationTimestamp="2025-10-12 21:14:48 +0000 UTC" firstStartedPulling="2025-10-12 21:14:50.528732671 +0000 UTC m=+3038.765031231" lastFinishedPulling="2025-10-12 21:14:54.032057134 +0000 UTC m=+3042.268355694" observedRunningTime="2025-10-12 21:14:54.591708962 +0000 UTC m=+3042.828007522" watchObservedRunningTime="2025-10-12 21:14:54.602091829 +0000 UTC m=+3042.838390389" Oct 12 21:14:54 crc kubenswrapper[4773]: I1012 21:14:54.622769 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:54 crc kubenswrapper[4773]: I1012 21:14:54.632039 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xq4s6"] Oct 12 21:14:56 crc kubenswrapper[4773]: I1012 21:14:56.505863 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" path="/var/lib/kubelet/pods/bf6aee0a-905f-418a-ace9-c09497895ea2/volumes" Oct 12 21:14:59 crc kubenswrapper[4773]: I1012 21:14:59.331832 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:59 crc kubenswrapper[4773]: I1012 21:14:59.332200 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:59 crc kubenswrapper[4773]: I1012 21:14:59.398661 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:59 crc kubenswrapper[4773]: I1012 21:14:59.662142 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:14:59 crc kubenswrapper[4773]: I1012 21:14:59.715295 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.165539 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv"] Oct 12 21:15:00 crc kubenswrapper[4773]: E1012 21:15:00.165999 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="extract-utilities" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.166019 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="extract-utilities" Oct 12 21:15:00 crc kubenswrapper[4773]: E1012 21:15:00.166038 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="extract-content" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.166047 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="extract-content" Oct 12 21:15:00 crc kubenswrapper[4773]: E1012 21:15:00.166066 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="registry-server" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.166073 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="registry-server" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.166260 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6aee0a-905f-418a-ace9-c09497895ea2" containerName="registry-server" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.166861 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.169182 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.173984 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.176660 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv"] Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.267814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2fmk\" (UniqueName: \"kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.267906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.267980 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.369338 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2fmk\" (UniqueName: \"kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.369446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.369551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.370696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.384490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.391657 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2fmk\" (UniqueName: \"kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk\") pod \"collect-profiles-29338395-9z7nv\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:00 crc kubenswrapper[4773]: I1012 21:15:00.542979 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:01 crc kubenswrapper[4773]: I1012 21:15:01.013595 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv"] Oct 12 21:15:01 crc kubenswrapper[4773]: I1012 21:15:01.636388 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3fcd410-40c1-4974-a418-83a1e46edd10" containerID="cb44a62f6f1f99aed4f7084640ddfcdee978b588600038f4f525353895d8c102" exitCode=0 Oct 12 21:15:01 crc kubenswrapper[4773]: I1012 21:15:01.636511 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" event={"ID":"b3fcd410-40c1-4974-a418-83a1e46edd10","Type":"ContainerDied","Data":"cb44a62f6f1f99aed4f7084640ddfcdee978b588600038f4f525353895d8c102"} Oct 12 21:15:01 crc kubenswrapper[4773]: I1012 21:15:01.636731 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" 
event={"ID":"b3fcd410-40c1-4974-a418-83a1e46edd10","Type":"ContainerStarted","Data":"dd13a34dd270dbcbc058c3456a6f0351f615ae79aeaae7a27c507e2476bb0479"} Oct 12 21:15:01 crc kubenswrapper[4773]: I1012 21:15:01.636878 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-snffg" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="registry-server" containerID="cri-o://ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc" gracePeriod=2 Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.030678 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.102130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities\") pod \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.102228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content\") pod \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.102295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrmbl\" (UniqueName: \"kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl\") pod \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\" (UID: \"3d016e5f-ad13-44c7-ba2c-328bff3a7a96\") " Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.104079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities" (OuterVolumeSpecName: "utilities") pod "3d016e5f-ad13-44c7-ba2c-328bff3a7a96" (UID: "3d016e5f-ad13-44c7-ba2c-328bff3a7a96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.110146 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl" (OuterVolumeSpecName: "kube-api-access-rrmbl") pod "3d016e5f-ad13-44c7-ba2c-328bff3a7a96" (UID: "3d016e5f-ad13-44c7-ba2c-328bff3a7a96"). InnerVolumeSpecName "kube-api-access-rrmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.166517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d016e5f-ad13-44c7-ba2c-328bff3a7a96" (UID: "3d016e5f-ad13-44c7-ba2c-328bff3a7a96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.203962 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.203988 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.204000 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrmbl\" (UniqueName: \"kubernetes.io/projected/3d016e5f-ad13-44c7-ba2c-328bff3a7a96-kube-api-access-rrmbl\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.487003 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:15:02 crc kubenswrapper[4773]: E1012 21:15:02.487362 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.656040 4773 generic.go:334] "Generic (PLEG): container finished" podID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerID="ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc" exitCode=0 Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.656267 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snffg" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.657174 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerDied","Data":"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc"} Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.657209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snffg" event={"ID":"3d016e5f-ad13-44c7-ba2c-328bff3a7a96","Type":"ContainerDied","Data":"d0c04b6b7eb7e6abdb82b48c15a59f1fc3d31b93c6a24734573fde4505784c8d"} Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.657229 4773 scope.go:117] "RemoveContainer" containerID="ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.690426 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.709935 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-snffg"] Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.716239 4773 scope.go:117] "RemoveContainer" containerID="d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.776689 4773 scope.go:117] "RemoveContainer" containerID="f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.802376 4773 scope.go:117] "RemoveContainer" containerID="ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc" Oct 12 21:15:02 crc kubenswrapper[4773]: E1012 21:15:02.802737 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc\": container with ID starting with ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc not found: ID does not exist" containerID="ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.802766 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc"} err="failed to get container status \"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc\": rpc error: code = NotFound desc = could not find container \"ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc\": container with ID starting with ffc1ec3fab4dd8925680517001b4112b29b67c7420e0d7143f03f113d953fdbc not found: ID does not exist" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.802788 4773 scope.go:117] "RemoveContainer" containerID="d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70" Oct 12 21:15:02 crc kubenswrapper[4773]: E1012 21:15:02.805086 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70\": container with ID starting with d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70 not found: ID does not exist" containerID="d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.805123 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70"} err="failed to get container status \"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70\": rpc error: code = NotFound desc = could not find container \"d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70\": container with ID 
starting with d782c1105c484c520dbf7404a8dff2c014ef9ba84e3193e6ad44f3fcb2750c70 not found: ID does not exist" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.805148 4773 scope.go:117] "RemoveContainer" containerID="f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c" Oct 12 21:15:02 crc kubenswrapper[4773]: E1012 21:15:02.805519 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c\": container with ID starting with f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c not found: ID does not exist" containerID="f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c" Oct 12 21:15:02 crc kubenswrapper[4773]: I1012 21:15:02.805541 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c"} err="failed to get container status \"f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c\": rpc error: code = NotFound desc = could not find container \"f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c\": container with ID starting with f8e8ec07f2e64cc19930ae82859120975b6b98be63eb41d5838ba2b76896642c not found: ID does not exist" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.056708 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.121590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume\") pod \"b3fcd410-40c1-4974-a418-83a1e46edd10\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.121779 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume\") pod \"b3fcd410-40c1-4974-a418-83a1e46edd10\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.121877 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2fmk\" (UniqueName: \"kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk\") pod \"b3fcd410-40c1-4974-a418-83a1e46edd10\" (UID: \"b3fcd410-40c1-4974-a418-83a1e46edd10\") " Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.122291 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume" (OuterVolumeSpecName: "config-volume") pod "b3fcd410-40c1-4974-a418-83a1e46edd10" (UID: "b3fcd410-40c1-4974-a418-83a1e46edd10"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.125147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b3fcd410-40c1-4974-a418-83a1e46edd10" (UID: "b3fcd410-40c1-4974-a418-83a1e46edd10"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.130063 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk" (OuterVolumeSpecName: "kube-api-access-k2fmk") pod "b3fcd410-40c1-4974-a418-83a1e46edd10" (UID: "b3fcd410-40c1-4974-a418-83a1e46edd10"). InnerVolumeSpecName "kube-api-access-k2fmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.224113 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2fmk\" (UniqueName: \"kubernetes.io/projected/b3fcd410-40c1-4974-a418-83a1e46edd10-kube-api-access-k2fmk\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.224145 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3fcd410-40c1-4974-a418-83a1e46edd10-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.224154 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3fcd410-40c1-4974-a418-83a1e46edd10-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.667008 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" event={"ID":"b3fcd410-40c1-4974-a418-83a1e46edd10","Type":"ContainerDied","Data":"dd13a34dd270dbcbc058c3456a6f0351f615ae79aeaae7a27c507e2476bb0479"} Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.667215 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd13a34dd270dbcbc058c3456a6f0351f615ae79aeaae7a27c507e2476bb0479" Oct 12 21:15:03 crc kubenswrapper[4773]: I1012 21:15:03.667180 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338395-9z7nv" Oct 12 21:15:04 crc kubenswrapper[4773]: I1012 21:15:04.142905 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng"] Oct 12 21:15:04 crc kubenswrapper[4773]: I1012 21:15:04.152724 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338350-f46ng"] Oct 12 21:15:04 crc kubenswrapper[4773]: I1012 21:15:04.509825 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" path="/var/lib/kubelet/pods/3d016e5f-ad13-44c7-ba2c-328bff3a7a96/volumes" Oct 12 21:15:04 crc kubenswrapper[4773]: I1012 21:15:04.512270 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da223ee3-ddbf-415a-8ad8-fb78a81fe5a0" path="/var/lib/kubelet/pods/da223ee3-ddbf-415a-8ad8-fb78a81fe5a0/volumes" Oct 12 21:15:15 crc kubenswrapper[4773]: I1012 21:15:15.481257 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:15:15 crc kubenswrapper[4773]: E1012 21:15:15.481945 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:15:30 crc kubenswrapper[4773]: I1012 21:15:30.493848 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:15:30 crc kubenswrapper[4773]: E1012 21:15:30.494548 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:15:34 crc kubenswrapper[4773]: I1012 21:15:34.735297 4773 scope.go:117] "RemoveContainer" containerID="ddccae5297268965e325d6e009816496a5ff282ac904811dfdbacdd2456d845f" Oct 12 21:15:44 crc kubenswrapper[4773]: I1012 21:15:44.481730 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:15:44 crc kubenswrapper[4773]: E1012 21:15:44.482349 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:15:56 crc kubenswrapper[4773]: I1012 21:15:56.481473 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:15:56 crc kubenswrapper[4773]: E1012 21:15:56.482993 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:16:11 crc kubenswrapper[4773]: I1012 21:16:11.481766 4773 scope.go:117] "RemoveContainer" 
containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:16:11 crc kubenswrapper[4773]: E1012 21:16:11.482679 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:16:22 crc kubenswrapper[4773]: I1012 21:16:22.494832 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:16:22 crc kubenswrapper[4773]: E1012 21:16:22.496039 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:16:34 crc kubenswrapper[4773]: I1012 21:16:34.811424 4773 scope.go:117] "RemoveContainer" containerID="62c93a46c60d117d46733cc20979c5409f8613d1841efbc0e0cb57aa66256414" Oct 12 21:16:34 crc kubenswrapper[4773]: I1012 21:16:34.834055 4773 scope.go:117] "RemoveContainer" containerID="90a93e2fe4ae5e51c255011b499630c49c147f661b3f39e195690810aeb45815" Oct 12 21:16:34 crc kubenswrapper[4773]: I1012 21:16:34.871061 4773 scope.go:117] "RemoveContainer" containerID="e8e286bac58eefb49ada4c514b5b754e2db596c3cec766ddbf6f8bd144f02c8f" Oct 12 21:16:36 crc kubenswrapper[4773]: I1012 21:16:36.481287 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:16:36 crc 
kubenswrapper[4773]: E1012 21:16:36.481848 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:16:51 crc kubenswrapper[4773]: I1012 21:16:51.482216 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:16:51 crc kubenswrapper[4773]: E1012 21:16:51.483390 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:17:02 crc kubenswrapper[4773]: I1012 21:17:02.498324 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:17:02 crc kubenswrapper[4773]: E1012 21:17:02.499399 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:17:13 crc kubenswrapper[4773]: I1012 21:17:13.481611 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 
12 21:17:13 crc kubenswrapper[4773]: E1012 21:17:13.482773 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:17:25 crc kubenswrapper[4773]: I1012 21:17:25.481388 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:17:25 crc kubenswrapper[4773]: E1012 21:17:25.482195 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:17:36 crc kubenswrapper[4773]: I1012 21:17:36.481049 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:17:36 crc kubenswrapper[4773]: E1012 21:17:36.481993 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:17:50 crc kubenswrapper[4773]: I1012 21:17:50.481465 4773 scope.go:117] "RemoveContainer" 
containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:17:50 crc kubenswrapper[4773]: E1012 21:17:50.482340 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:18:05 crc kubenswrapper[4773]: I1012 21:18:05.481555 4773 scope.go:117] "RemoveContainer" containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:18:06 crc kubenswrapper[4773]: I1012 21:18:06.369941 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd"} Oct 12 21:18:32 crc kubenswrapper[4773]: I1012 21:18:32.622215 4773 generic.go:334] "Generic (PLEG): container finished" podID="f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" containerID="6e8b61c72edb04c057b0cb2f535d0409ecd301f5b1272e7b78126fc97a5ce9b2" exitCode=0 Oct 12 21:18:32 crc kubenswrapper[4773]: I1012 21:18:32.622333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" event={"ID":"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac","Type":"ContainerDied","Data":"6e8b61c72edb04c057b0cb2f535d0409ecd301f5b1272e7b78126fc97a5ce9b2"} Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.083816 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221596 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221709 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221801 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221836 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221889 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221939 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.221977 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.222035 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.222053 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tgrg\" (UniqueName: \"kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.222071 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle\") pod \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\" (UID: \"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac\") " Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.238999 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.239017 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph" (OuterVolumeSpecName: "ceph") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.245577 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.252328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg" (OuterVolumeSpecName: "kube-api-access-7tgrg") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "kube-api-access-7tgrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.254915 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.257879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.260663 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory" (OuterVolumeSpecName: "inventory") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.272399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.273295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.276328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.279925 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" (UID: "f216a3f9-57a7-4084-b8c1-ed07cd69d4ac"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324213 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324243 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tgrg\" (UniqueName: \"kubernetes.io/projected/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-kube-api-access-7tgrg\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324255 4773 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324266 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324276 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324285 4773 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324293 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc 
kubenswrapper[4773]: I1012 21:18:34.324301 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324308 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324316 4773 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.324324 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f216a3f9-57a7-4084-b8c1-ed07cd69d4ac-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.640280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" event={"ID":"f216a3f9-57a7-4084-b8c1-ed07cd69d4ac","Type":"ContainerDied","Data":"550bbae69a55d7a19c187ca43370024cf64b700718e52b3fc81db89cb35c2e0d"} Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.640545 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550bbae69a55d7a19c187ca43370024cf64b700718e52b3fc81db89cb35c2e0d" Oct 12 21:18:34 crc kubenswrapper[4773]: I1012 21:18:34.640333 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.490244 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 12 21:18:48 crc kubenswrapper[4773]: E1012 21:18:48.490895 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="extract-utilities" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.490906 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="extract-utilities" Oct 12 21:18:48 crc kubenswrapper[4773]: E1012 21:18:48.490914 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.490921 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 12 21:18:48 crc kubenswrapper[4773]: E1012 21:18:48.490949 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="extract-content" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.490955 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="extract-content" Oct 12 21:18:48 crc kubenswrapper[4773]: E1012 21:18:48.490976 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="registry-server" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.490982 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="registry-server" Oct 12 21:18:48 crc kubenswrapper[4773]: E1012 21:18:48.490994 4773 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3fcd410-40c1-4974-a418-83a1e46edd10" containerName="collect-profiles" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.491000 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3fcd410-40c1-4974-a418-83a1e46edd10" containerName="collect-profiles" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.491161 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f216a3f9-57a7-4084-b8c1-ed07cd69d4ac" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.491175 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3fcd410-40c1-4974-a418-83a1e46edd10" containerName="collect-profiles" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.491192 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d016e5f-ad13-44c7-ba2c-328bff3a7a96" containerName="registry-server" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.492052 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.495999 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.496025 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.506400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.550956 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.552376 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.554044 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.574676 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.589951 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-run\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590072 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 
21:18:48.590239 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7wt\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-kube-api-access-jn7wt\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590292 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:48 crc kubenswrapper[4773]: 
I1012 21:18:48.590499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590527 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590558 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590594 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590615 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.590630 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.692663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.692988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693138 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693224 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693318 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693525 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693667 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693900 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.693976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-run\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694274 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694527 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-dev\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4zt\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-kube-api-access-ss4zt\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694823 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-run\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-sys\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-lib-modules\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694898 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.694899 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-run\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695097 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-ceph\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695130 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695149 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7wt\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-kube-api-access-jn7wt\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695306 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-scripts\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695456 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.695665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3963233-e9ff-4f92-a94a-5b99835ab607-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.699500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.699622 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.700555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.703400 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.704362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3963233-e9ff-4f92-a94a-5b99835ab607-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.712654 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7wt\" (UniqueName: \"kubernetes.io/projected/c3963233-e9ff-4f92-a94a-5b99835ab607-kube-api-access-jn7wt\") pod \"cinder-volume-volume1-0\" (UID: \"c3963233-e9ff-4f92-a94a-5b99835ab607\") " pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.796483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-dev\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797055 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4zt\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-kube-api-access-ss4zt\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-sys\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797589 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-lib-modules\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797699 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797811 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-ceph\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797886 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-scripts\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798297 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798556 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-run\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798644 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798787 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.796622 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-dev\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.798949 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-lib-modules\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.797558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-sys\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.799081 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.800321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.800382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-run\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.800582 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.800731 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.801219 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/716b9576-d48b-4720-9fb4-73f6744adee5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.803152 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.803863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-scripts\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.804056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-ceph\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.804429 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-config-data\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.805326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716b9576-d48b-4720-9fb4-73f6744adee5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.807135 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.815912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4zt\" (UniqueName: \"kubernetes.io/projected/716b9576-d48b-4720-9fb4-73f6744adee5-kube-api-access-ss4zt\") pod \"cinder-backup-0\" (UID: \"716b9576-d48b-4720-9fb4-73f6744adee5\") " pod="openstack/cinder-backup-0"
Oct 12 21:18:48 crc kubenswrapper[4773]: I1012 21:18:48.870271 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.182476 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-2bsmz"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.183611 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2bsmz"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.209151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnkg5\" (UniqueName: \"kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5\") pod \"manila-db-create-2bsmz\" (UID: \"d3c5eb22-77dc-4904-b52e-9c40d9d488e2\") " pod="openstack/manila-db-create-2bsmz"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.213947 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-2bsmz"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.310817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnkg5\" (UniqueName: \"kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5\") pod \"manila-db-create-2bsmz\" (UID: \"d3c5eb22-77dc-4904-b52e-9c40d9d488e2\") " pod="openstack/manila-db-create-2bsmz"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.338436 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnkg5\" (UniqueName: \"kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5\") pod \"manila-db-create-2bsmz\" (UID: \"d3c5eb22-77dc-4904-b52e-9c40d9d488e2\") " pod="openstack/manila-db-create-2bsmz"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.358497 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.360206 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b7789fcb9-fpqth"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.366191 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.366536 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-9nnhw"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.366678 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.366825 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.382854 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.426776 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.428348 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.431971 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.432228 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fb9l2"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.432355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.432390 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.448252 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.494200 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.495784 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.498966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.499103 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.522363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2bsmz"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.526214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vms52\" (UniqueName: \"kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.526333 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.526375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.526448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth"
Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.526602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") 
" pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.629659 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.630130 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.630223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631013 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631123 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " 
pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhf6h\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631368 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vms52\" (UniqueName: \"kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") 
" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631816 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.631925 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: 
I1012 21:18:49.632193 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wj9\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632285 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 
21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632610 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.632992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.633038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.633086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.634867 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.635415 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.637972 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.645439 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.667676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vms52\" (UniqueName: \"kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52\") pod \"horizon-6b7789fcb9-fpqth\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.669248 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.679453 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.691865 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:49 crc kubenswrapper[4773]: E1012 21:18:49.692674 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-m7wj9 logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="4ae4c46b-e8a8-47e6-8e07-f6a571c55994" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.704817 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.706579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.717022 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wj9\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 
21:18:49.734531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734549 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734625 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734745 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.734769 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.735322 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.735433 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736769 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736813 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhf6h\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736848 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736935 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.736965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.744070 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.745379 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.745402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.748483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.748530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.748869 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.749365 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.749601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.761586 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.762528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.767361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.767642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.773605 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.776080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhf6h\" (UniqueName: 
\"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.790218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.805986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c3963233-e9ff-4f92-a94a-5b99835ab607","Type":"ContainerStarted","Data":"ac4d005644bb0874b8b9568aadc39540cee372d0763b03c8c763352efcddfead"} Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.806829 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.807188 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"716b9576-d48b-4720-9fb4-73f6744adee5","Type":"ContainerStarted","Data":"e5f988eb46a357bf781518cc1f8c81c17890feb120254c2eb93a950639a3725d"} Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.812703 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wj9\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.819471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.845697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mr9c\" (UniqueName: \"kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.847539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: 
I1012 21:18:49.847629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.847709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.847755 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.857495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.873048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.953655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.954997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.953709 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.955155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mr9c\" (UniqueName: \"kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.955230 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.955287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key\") pod 
\"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.956446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.957068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.986942 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.989962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mr9c\" (UniqueName: \"kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c\") pod \"horizon-7bdbfb487c-6x9rh\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:49 crc kubenswrapper[4773]: I1012 21:18:49.993591 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.055941 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-2bsmz"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159363 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159414 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159477 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7wj9\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159507 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: 
I1012 21:18:50.159540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.159665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle\") pod \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\" (UID: \"4ae4c46b-e8a8-47e6-8e07-f6a571c55994\") " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.163124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.167649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs" (OuterVolumeSpecName: "logs") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.168030 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.168879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9" (OuterVolumeSpecName: "kube-api-access-m7wj9") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "kube-api-access-m7wj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.168941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.169036 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph" (OuterVolumeSpecName: "ceph") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.173002 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.173033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts" (OuterVolumeSpecName: "scripts") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.174447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data" (OuterVolumeSpecName: "config-data") pod "4ae4c46b-e8a8-47e6-8e07-f6a571c55994" (UID: "4ae4c46b-e8a8-47e6-8e07-f6a571c55994"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.175378 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.209100 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.258912 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264388 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264414 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264423 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264463 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264478 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264488 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc 
kubenswrapper[4773]: I1012 21:18:50.264498 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7wj9\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-kube-api-access-m7wj9\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264506 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.264516 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4ae4c46b-e8a8-47e6-8e07-f6a571c55994-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.301104 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.366982 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.818368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.870058 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.877491 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerStarted","Data":"41055f4d2ff772e64364c67f79d9413eee394ae756a9a87babedcb5c72462197"} Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.879455 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="d3c5eb22-77dc-4904-b52e-9c40d9d488e2" containerID="7d4952587b2616c873aea4d8a063eef16df9a555a757847398533a5a453eeec5" exitCode=0 Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.879541 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.879977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2bsmz" event={"ID":"d3c5eb22-77dc-4904-b52e-9c40d9d488e2","Type":"ContainerDied","Data":"7d4952587b2616c873aea4d8a063eef16df9a555a757847398533a5a453eeec5"} Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.880054 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2bsmz" event={"ID":"d3c5eb22-77dc-4904-b52e-9c40d9d488e2","Type":"ContainerStarted","Data":"e26b763b03e4b171888beee4d279ac25120129e778db9a098866f66b87eafe9e"} Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.968835 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.974459 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:50 crc kubenswrapper[4773]: I1012 21:18:50.999556 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.001462 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.004322 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.008598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.009892 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:51 crc kubenswrapper[4773]: W1012 21:18:51.059918 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5354720_d7d2_4b8c_b4e6_ed0f8c546b69.slice/crio-b6cc93f7fcd2b9141a6d7aca4f5cf7a6d72a452273e4fc2187023bafdddb5690 WatchSource:0}: Error finding container b6cc93f7fcd2b9141a6d7aca4f5cf7a6d72a452273e4fc2187023bafdddb5690: Status 404 returned error can't find the container with id b6cc93f7fcd2b9141a6d7aca4f5cf7a6d72a452273e4fc2187023bafdddb5690 Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.083260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5t2w\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.083602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 
21:18:51.083700 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.083744 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.083766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.083790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.084407 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 
21:18:51.084505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.084533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191401 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191441 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191457 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191611 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5t2w\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.191628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.192452 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.192557 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.193546 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.202541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.204116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.214404 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.214831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.215993 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.221935 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5t2w\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.261943 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " 
pod="openstack/glance-default-external-api-0" Oct 12 21:18:51 crc kubenswrapper[4773]: I1012 21:18:51.321186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.788330 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.814148 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.835307 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.839091 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.867980 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918689 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7pf\" (UniqueName: \"kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.918760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.943106 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 
21:18:52 crc kubenswrapper[4773]: I1012 21:18:51.999546 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028122 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7pf\" (UniqueName: \"kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data\") pod \"horizon-748d9fcc4-b9gp4\" (UID: 
\"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.028418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.033000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.036211 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.036419 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerStarted","Data":"6408f64c17d31dad2052714bdee974f8daca996db274e6849de6b6814c9334ee"} Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.037899 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " 
pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.045168 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.045340 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.059047 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.062189 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.065980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerStarted","Data":"b6cc93f7fcd2b9141a6d7aca4f5cf7a6d72a452273e4fc2187023bafdddb5690"} Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.072500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5g7pf\" (UniqueName: \"kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf\") pod \"horizon-748d9fcc4-b9gp4\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.074643 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f5486cbb4-g66c2"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.079661 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.094008 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5486cbb4-g66c2"] Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.099378 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c3963233-e9ff-4f92-a94a-5b99835ab607","Type":"ContainerStarted","Data":"e2f555f7bdd8a789bd97e0908cbc5efea4617b9cb9118871421614c42c2cdd87"} Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.102697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"716b9576-d48b-4720-9fb4-73f6744adee5","Type":"ContainerStarted","Data":"4b8cdd10c89800d8dc5c4683daefa60bddf19ab50853297820360e4e8eda82a6"} Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.206751 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-scripts\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236268 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-secret-key\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236318 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-config-data\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236364 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f878004-f437-4db3-a695-09d92a0bc6e4-logs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236406 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-combined-ca-bundle\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " 
pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2jd\" (UniqueName: \"kubernetes.io/projected/7f878004-f437-4db3-a695-09d92a0bc6e4-kube-api-access-6z2jd\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.236465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-tls-certs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.340763 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-scripts\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-secret-key\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341090 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-config-data\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 
21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f878004-f437-4db3-a695-09d92a0bc6e4-logs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-combined-ca-bundle\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341213 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2jd\" (UniqueName: \"kubernetes.io/projected/7f878004-f437-4db3-a695-09d92a0bc6e4-kube-api-access-6z2jd\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.341248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-tls-certs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.342020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f878004-f437-4db3-a695-09d92a0bc6e4-logs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.342145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-scripts\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.343386 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f878004-f437-4db3-a695-09d92a0bc6e4-config-data\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.355370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-secret-key\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.355817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-combined-ca-bundle\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.359099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f878004-f437-4db3-a695-09d92a0bc6e4-horizon-tls-certs\") pod \"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.360174 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2jd\" (UniqueName: \"kubernetes.io/projected/7f878004-f437-4db3-a695-09d92a0bc6e4-kube-api-access-6z2jd\") pod 
\"horizon-6f5486cbb4-g66c2\" (UID: \"7f878004-f437-4db3-a695-09d92a0bc6e4\") " pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.494606 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae4c46b-e8a8-47e6-8e07-f6a571c55994" path="/var/lib/kubelet/pods/4ae4c46b-e8a8-47e6-8e07-f6a571c55994/volumes" Oct 12 21:18:52 crc kubenswrapper[4773]: I1012 21:18:52.629229 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.022035 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2bsmz" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.111041 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:18:53 crc kubenswrapper[4773]: W1012 21:18:53.117050 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf116e1_a9e0_44e1_bd21_bcbfe2c365eb.slice/crio-56d0d31d9f111798d6fed36aa0b8ee8278251563cd1c9a1ade5d6fbf0b9db604 WatchSource:0}: Error finding container 56d0d31d9f111798d6fed36aa0b8ee8278251563cd1c9a1ade5d6fbf0b9db604: Status 404 returned error can't find the container with id 56d0d31d9f111798d6fed36aa0b8ee8278251563cd1c9a1ade5d6fbf0b9db604 Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.173412 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerStarted","Data":"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02"} Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.174707 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnkg5\" (UniqueName: 
\"kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5\") pod \"d3c5eb22-77dc-4904-b52e-9c40d9d488e2\" (UID: \"d3c5eb22-77dc-4904-b52e-9c40d9d488e2\") " Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.178940 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-2bsmz" event={"ID":"d3c5eb22-77dc-4904-b52e-9c40d9d488e2","Type":"ContainerDied","Data":"e26b763b03e4b171888beee4d279ac25120129e778db9a098866f66b87eafe9e"} Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.178977 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e26b763b03e4b171888beee4d279ac25120129e778db9a098866f66b87eafe9e" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.179055 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-2bsmz" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.185658 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c3963233-e9ff-4f92-a94a-5b99835ab607","Type":"ContainerStarted","Data":"46b58fc8d8e777f3e444af5586fa6510c250c56ad180caed25cff8d6a0c2dcb3"} Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.220778 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5" (OuterVolumeSpecName: "kube-api-access-rnkg5") pod "d3c5eb22-77dc-4904-b52e-9c40d9d488e2" (UID: "d3c5eb22-77dc-4904-b52e-9c40d9d488e2"). InnerVolumeSpecName "kube-api-access-rnkg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.238818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"716b9576-d48b-4720-9fb4-73f6744adee5","Type":"ContainerStarted","Data":"d2fa1ae78bb6f1e4baaef9accbaa89c9469ce57d70f3b888ecbfc1bd9235628a"} Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.278171 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnkg5\" (UniqueName: \"kubernetes.io/projected/d3c5eb22-77dc-4904-b52e-9c40d9d488e2-kube-api-access-rnkg5\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.340592 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.879663205 podStartE2EDuration="5.340574667s" podCreationTimestamp="2025-10-12 21:18:48 +0000 UTC" firstStartedPulling="2025-10-12 21:18:49.669015388 +0000 UTC m=+3277.905313948" lastFinishedPulling="2025-10-12 21:18:51.12992685 +0000 UTC m=+3279.366225410" observedRunningTime="2025-10-12 21:18:53.270576573 +0000 UTC m=+3281.506875133" watchObservedRunningTime="2025-10-12 21:18:53.340574667 +0000 UTC m=+3281.576873227" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.342247 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.929476897 podStartE2EDuration="5.342240943s" podCreationTimestamp="2025-10-12 21:18:48 +0000 UTC" firstStartedPulling="2025-10-12 21:18:49.713308969 +0000 UTC m=+3277.949607529" lastFinishedPulling="2025-10-12 21:18:51.126073015 +0000 UTC m=+3279.362371575" observedRunningTime="2025-10-12 21:18:53.333343619 +0000 UTC m=+3281.569642179" watchObservedRunningTime="2025-10-12 21:18:53.342240943 +0000 UTC m=+3281.578539503" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.355637 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.491284 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5486cbb4-g66c2"] Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.807816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:53 crc kubenswrapper[4773]: I1012 21:18:53.871100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.252917 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerStarted","Data":"56d0d31d9f111798d6fed36aa0b8ee8278251563cd1c9a1ade5d6fbf0b9db604"} Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.255802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5486cbb4-g66c2" event={"ID":"7f878004-f437-4db3-a695-09d92a0bc6e4","Type":"ContainerStarted","Data":"79f852488cccbbad46c9bc4e448d29e919678ff639b55f3c1dfc142d2c808a97"} Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.259667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d8f97dc-0f26-42c1-af15-67632a442b68","Type":"ContainerStarted","Data":"ff53895c3c41e27f2153055ecc58f3a86adca43a1110f7f5fccb560823d5e9d7"} Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.261956 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerStarted","Data":"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d"} Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.262056 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-log" containerID="cri-o://bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" gracePeriod=30 Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.262178 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-httpd" containerID="cri-o://ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" gracePeriod=30 Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.289902 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.289882598 podStartE2EDuration="5.289882598s" podCreationTimestamp="2025-10-12 21:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:18:54.282729142 +0000 UTC m=+3282.519027702" watchObservedRunningTime="2025-10-12 21:18:54.289882598 +0000 UTC m=+3282.526181158" Oct 12 21:18:54 crc kubenswrapper[4773]: I1012 21:18:54.992992 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030276 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhf6h\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030305 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030330 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030543 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.030605 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs\") pod \"a5000645-c8d4-437f-8624-a2b6175d99e8\" (UID: \"a5000645-c8d4-437f-8624-a2b6175d99e8\") " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.031531 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.032864 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs" (OuterVolumeSpecName: "logs") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.041825 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts" (OuterVolumeSpecName: "scripts") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.041996 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph" (OuterVolumeSpecName: "ceph") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.054238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h" (OuterVolumeSpecName: "kube-api-access-bhf6h") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "kube-api-access-bhf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.062007 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.073852 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.133804 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhf6h\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-kube-api-access-bhf6h\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.133954 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a5000645-c8d4-437f-8624-a2b6175d99e8-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.134024 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.134078 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.134146 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.134196 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5000645-c8d4-437f-8624-a2b6175d99e8-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.134314 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.163909 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data" (OuterVolumeSpecName: "config-data") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.168424 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.171447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a5000645-c8d4-437f-8624-a2b6175d99e8" (UID: "a5000645-c8d4-437f-8624-a2b6175d99e8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.236572 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.236598 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.236607 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5000645-c8d4-437f-8624-a2b6175d99e8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283116 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerID="ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" exitCode=143 Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283149 4773 generic.go:334] "Generic (PLEG): container finished" podID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerID="bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" exitCode=143 Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283187 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerDied","Data":"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d"} Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283211 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerDied","Data":"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02"} Oct 12 21:18:55 crc 
kubenswrapper[4773]: I1012 21:18:55.283219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a5000645-c8d4-437f-8624-a2b6175d99e8","Type":"ContainerDied","Data":"6408f64c17d31dad2052714bdee974f8daca996db274e6849de6b6814c9334ee"} Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283234 4773 scope.go:117] "RemoveContainer" containerID="ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.283348 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.287332 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d8f97dc-0f26-42c1-af15-67632a442b68","Type":"ContainerStarted","Data":"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b"} Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.287478 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-log" containerID="cri-o://1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" gracePeriod=30 Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.287583 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-httpd" containerID="cri-o://740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" gracePeriod=30 Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.313342 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.313322137 podStartE2EDuration="5.313322137s" podCreationTimestamp="2025-10-12 21:18:50 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:18:55.310999284 +0000 UTC m=+3283.547297844" watchObservedRunningTime="2025-10-12 21:18:55.313322137 +0000 UTC m=+3283.549620687" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.358907 4773 scope.go:117] "RemoveContainer" containerID="bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.368882 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.383488 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.397107 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:55 crc kubenswrapper[4773]: E1012 21:18:55.397800 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5eb22-77dc-4904-b52e-9c40d9d488e2" containerName="mariadb-database-create" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.397816 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5eb22-77dc-4904-b52e-9c40d9d488e2" containerName="mariadb-database-create" Oct 12 21:18:55 crc kubenswrapper[4773]: E1012 21:18:55.397848 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-httpd" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.397858 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-httpd" Oct 12 21:18:55 crc kubenswrapper[4773]: E1012 21:18:55.397876 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-log" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 
21:18:55.397882 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-log" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.398106 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5eb22-77dc-4904-b52e-9c40d9d488e2" containerName="mariadb-database-create" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.398127 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-log" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.398146 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" containerName="glance-httpd" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.399173 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.403669 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.403813 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.423527 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441079 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wts8l\" (UniqueName: \"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-kube-api-access-wts8l\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441146 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441263 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441433 4773 scope.go:117] "RemoveContainer" 
containerID="ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.441496 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: E1012 21:18:55.443238 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d\": container with ID starting with ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d not found: ID does not exist" containerID="ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.443329 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d"} err="failed to 
get container status \"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d\": rpc error: code = NotFound desc = could not find container \"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d\": container with ID starting with ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d not found: ID does not exist" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.443414 4773 scope.go:117] "RemoveContainer" containerID="bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" Oct 12 21:18:55 crc kubenswrapper[4773]: E1012 21:18:55.444040 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02\": container with ID starting with bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02 not found: ID does not exist" containerID="bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.444192 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02"} err="failed to get container status \"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02\": rpc error: code = NotFound desc = could not find container \"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02\": container with ID starting with bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02 not found: ID does not exist" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.444252 4773 scope.go:117] "RemoveContainer" containerID="ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.445059 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d"} 
err="failed to get container status \"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d\": rpc error: code = NotFound desc = could not find container \"ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d\": container with ID starting with ff212da31aa2121552b6b6b3d88a9945261de541844a4da90f1a1da5543ac20d not found: ID does not exist" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.445147 4773 scope.go:117] "RemoveContainer" containerID="bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.445515 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02"} err="failed to get container status \"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02\": rpc error: code = NotFound desc = could not find container \"bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02\": container with ID starting with bba26a799bb374729fdba9bfceb8f3c624fef3d095021b6b6471beba4229dc02 not found: ID does not exist" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.542909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.542954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543004 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543146 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543163 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wts8l\" (UniqueName: 
\"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-kube-api-access-wts8l\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.543292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.545650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.545947 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e607bc3-d77b-4dfb-a697-911f6dea3244-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.546514 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.556005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.557337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.560646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.561916 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.595364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wts8l\" (UniqueName: \"kubernetes.io/projected/1e607bc3-d77b-4dfb-a697-911f6dea3244-kube-api-access-wts8l\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.595909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e607bc3-d77b-4dfb-a697-911f6dea3244-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " 
pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.670813 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e607bc3-d77b-4dfb-a697-911f6dea3244\") " pod="openstack/glance-default-internal-api-0" Oct 12 21:18:55 crc kubenswrapper[4773]: I1012 21:18:55.744009 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.051852 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077311 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077379 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077493 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5t2w\" (UniqueName: 
\"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077555 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077643 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.077738 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs\") pod \"4d8f97dc-0f26-42c1-af15-67632a442b68\" (UID: \"4d8f97dc-0f26-42c1-af15-67632a442b68\") " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.079899 4773 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs" (OuterVolumeSpecName: "logs") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.081541 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.095411 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts" (OuterVolumeSpecName: "scripts") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.097942 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.098201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w" (OuterVolumeSpecName: "kube-api-access-v5t2w") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "kube-api-access-v5t2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.103213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph" (OuterVolumeSpecName: "ceph") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.137123 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.163394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data" (OuterVolumeSpecName: "config-data") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185133 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185497 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185542 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185556 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5t2w\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-kube-api-access-v5t2w\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185568 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185579 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185590 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d8f97dc-0f26-42c1-af15-67632a442b68-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.185601 4773 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d8f97dc-0f26-42c1-af15-67632a442b68-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.195109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4d8f97dc-0f26-42c1-af15-67632a442b68" (UID: "4d8f97dc-0f26-42c1-af15-67632a442b68"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.214080 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.287759 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.287788 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d8f97dc-0f26-42c1-af15-67632a442b68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301067 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerID="1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" exitCode=143 Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301094 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerID="740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" exitCode=143 Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301140 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4d8f97dc-0f26-42c1-af15-67632a442b68","Type":"ContainerDied","Data":"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b"} Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d8f97dc-0f26-42c1-af15-67632a442b68","Type":"ContainerDied","Data":"ff53895c3c41e27f2153055ecc58f3a86adca43a1110f7f5fccb560823d5e9d7"} Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d8f97dc-0f26-42c1-af15-67632a442b68","Type":"ContainerDied","Data":"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec"} Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301194 4773 scope.go:117] "RemoveContainer" containerID="740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.301261 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.331097 4773 scope.go:117] "RemoveContainer" containerID="1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.339250 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.349048 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.383147 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:56 crc kubenswrapper[4773]: E1012 21:18:56.383583 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-log" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.383595 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-log" Oct 12 21:18:56 crc kubenswrapper[4773]: E1012 21:18:56.383605 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-httpd" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.383613 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-httpd" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.383912 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-httpd" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.383929 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" containerName="glance-log" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.384960 4773 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.389094 4773 scope.go:117] "RemoveContainer" containerID="740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.389121 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.389370 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.391183 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:56 crc kubenswrapper[4773]: E1012 21:18:56.406028 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec\": container with ID starting with 740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec not found: ID does not exist" containerID="740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.406079 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec"} err="failed to get container status \"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec\": rpc error: code = NotFound desc = could not find container \"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec\": container with ID starting with 740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec not found: ID does not exist" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.406280 4773 scope.go:117] "RemoveContainer" 
containerID="1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" Oct 12 21:18:56 crc kubenswrapper[4773]: E1012 21:18:56.408663 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b\": container with ID starting with 1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b not found: ID does not exist" containerID="1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.408703 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b"} err="failed to get container status \"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b\": rpc error: code = NotFound desc = could not find container \"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b\": container with ID starting with 1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b not found: ID does not exist" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.408767 4773 scope.go:117] "RemoveContainer" containerID="740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.409506 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec"} err="failed to get container status \"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec\": rpc error: code = NotFound desc = could not find container \"740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec\": container with ID starting with 740e2274caa32aa04aaf4b7374641fd3dba58a0bd2c08af9c9c62fe9e8d446ec not found: ID does not exist" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.409554 4773 scope.go:117] 
"RemoveContainer" containerID="1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.413336 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b"} err="failed to get container status \"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b\": rpc error: code = NotFound desc = could not find container \"1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b\": container with ID starting with 1811cd886792ca073b7b3fc4b71bcd633f865b3b042d86447128d68bffbf899b not found: ID does not exist" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.466848 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.497147 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8f97dc-0f26-42c1-af15-67632a442b68" path="/var/lib/kubelet/pods/4d8f97dc-0f26-42c1-af15-67632a442b68/volumes" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.498630 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.498655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.498702 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.498942 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fw9w\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-kube-api-access-6fw9w\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.498988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-config-data\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.499023 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-ceph\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.499049 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.499080 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-logs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.499103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-scripts\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.501233 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5000645-c8d4-437f-8624-a2b6175d99e8" path="/var/lib/kubelet/pods/a5000645-c8d4-437f-8624-a2b6175d99e8/volumes" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fw9w\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-kube-api-access-6fw9w\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-config-data\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600508 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600534 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-logs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600589 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-scripts\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " 
pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.600748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.604138 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.604212 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-logs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.604392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6225801c-d77f-493c-834a-1393a8a1d239-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.607883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: 
I1012 21:18:56.613278 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-ceph\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.613618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-scripts\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.615593 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.615890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fw9w\" (UniqueName: \"kubernetes.io/projected/6225801c-d77f-493c-834a-1393a8a1d239-kube-api-access-6fw9w\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.618042 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6225801c-d77f-493c-834a-1393a8a1d239-config-data\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.640784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6225801c-d77f-493c-834a-1393a8a1d239\") " pod="openstack/glance-default-external-api-0" Oct 12 21:18:56 crc kubenswrapper[4773]: I1012 21:18:56.727072 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 21:18:57 crc kubenswrapper[4773]: I1012 21:18:57.333584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e607bc3-d77b-4dfb-a697-911f6dea3244","Type":"ContainerStarted","Data":"2198d29fcc7861344af879c738cf25a77d8f7ba1328b099ffc4cd751e84c0462"} Oct 12 21:18:57 crc kubenswrapper[4773]: I1012 21:18:57.334158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e607bc3-d77b-4dfb-a697-911f6dea3244","Type":"ContainerStarted","Data":"f8804d0c09211529b8f3a99b45c6a673acd2a62529bf9aa5a27fd8c492f812c3"} Oct 12 21:18:57 crc kubenswrapper[4773]: I1012 21:18:57.643236 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 21:18:57 crc kubenswrapper[4773]: W1012 21:18:57.667020 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6225801c_d77f_493c_834a_1393a8a1d239.slice/crio-4084feeb180f36a3d08baf50b048c2deb421fa1a37f8e5723290a7d89e4f0255 WatchSource:0}: Error finding container 4084feeb180f36a3d08baf50b048c2deb421fa1a37f8e5723290a7d89e4f0255: Status 404 returned error can't find the container with id 4084feeb180f36a3d08baf50b048c2deb421fa1a37f8e5723290a7d89e4f0255 Oct 12 21:18:58 crc kubenswrapper[4773]: I1012 21:18:58.347689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6225801c-d77f-493c-834a-1393a8a1d239","Type":"ContainerStarted","Data":"0c730d6810396a8d7ca406481ad7d5410a15e9bb8d9ecca77325f04421ed8505"} Oct 12 21:18:58 crc kubenswrapper[4773]: I1012 21:18:58.348024 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6225801c-d77f-493c-834a-1393a8a1d239","Type":"ContainerStarted","Data":"4084feeb180f36a3d08baf50b048c2deb421fa1a37f8e5723290a7d89e4f0255"} Oct 12 21:18:58 crc kubenswrapper[4773]: I1012 21:18:58.351273 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e607bc3-d77b-4dfb-a697-911f6dea3244","Type":"ContainerStarted","Data":"750c236c575e2c26aa064c3c0c445c2ff0b5233d1d86ad2723f7b395c8dccbcf"} Oct 12 21:18:58 crc kubenswrapper[4773]: I1012 21:18:58.384931 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.384903629 podStartE2EDuration="3.384903629s" podCreationTimestamp="2025-10-12 21:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:18:58.378065142 +0000 UTC m=+3286.614363702" watchObservedRunningTime="2025-10-12 21:18:58.384903629 +0000 UTC m=+3286.621202199" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.014597 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.113224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.244282 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-8dc8-account-create-vvvfj"] Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.253385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.254394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-8dc8-account-create-vvvfj"] Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.257842 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.351974 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tblf\" (UniqueName: \"kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf\") pod \"manila-8dc8-account-create-vvvfj\" (UID: \"30baf00d-ffd7-4717-86a5-a5d97f990a5f\") " pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.454366 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tblf\" (UniqueName: \"kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf\") pod \"manila-8dc8-account-create-vvvfj\" (UID: \"30baf00d-ffd7-4717-86a5-a5d97f990a5f\") " pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.479814 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tblf\" (UniqueName: \"kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf\") pod \"manila-8dc8-account-create-vvvfj\" (UID: \"30baf00d-ffd7-4717-86a5-a5d97f990a5f\") " pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:18:59 crc kubenswrapper[4773]: I1012 21:18:59.575871 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:19:03 crc kubenswrapper[4773]: I1012 21:19:03.957482 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-8dc8-account-create-vvvfj"] Oct 12 21:19:03 crc kubenswrapper[4773]: W1012 21:19:03.967472 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30baf00d_ffd7_4717_86a5_a5d97f990a5f.slice/crio-468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f WatchSource:0}: Error finding container 468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f: Status 404 returned error can't find the container with id 468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.419909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerStarted","Data":"592c0b7b3acf5884f2e1c8fc27ff80b95861279d02e3f59c8127b237c90c16a6"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.424732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6225801c-d77f-493c-834a-1393a8a1d239","Type":"ContainerStarted","Data":"2cd679b0f6cfacbf394b1926f897226657d743eb13e0c65f767ff2a8531c9fe4"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.426075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerStarted","Data":"ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.428589 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" 
event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerStarted","Data":"8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.430780 4773 generic.go:334] "Generic (PLEG): container finished" podID="30baf00d-ffd7-4717-86a5-a5d97f990a5f" containerID="977740cf82be5cb8c53419b0a218650845537a46295c9766904c19142bccc317" exitCode=0 Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.430834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8dc8-account-create-vvvfj" event={"ID":"30baf00d-ffd7-4717-86a5-a5d97f990a5f","Type":"ContainerDied","Data":"977740cf82be5cb8c53419b0a218650845537a46295c9766904c19142bccc317"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.430857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8dc8-account-create-vvvfj" event={"ID":"30baf00d-ffd7-4717-86a5-a5d97f990a5f","Type":"ContainerStarted","Data":"468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f"} Oct 12 21:19:04 crc kubenswrapper[4773]: I1012 21:19:04.459710 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.459687862 podStartE2EDuration="8.459687862s" podCreationTimestamp="2025-10-12 21:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:19:04.448203278 +0000 UTC m=+3292.684501848" watchObservedRunningTime="2025-10-12 21:19:04.459687862 +0000 UTC m=+3292.695986422" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.443063 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerStarted","Data":"559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e"} Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.443529 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bdbfb487c-6x9rh" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon-log" containerID="cri-o://8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef" gracePeriod=30 Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.443964 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bdbfb487c-6x9rh" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon" containerID="cri-o://559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e" gracePeriod=30 Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.450124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerStarted","Data":"9139515c9bc911481c56d15cc53d5964d7bd8c93d7db9cb8e171178992bbc3d7"} Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.454020 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5486cbb4-g66c2" event={"ID":"7f878004-f437-4db3-a695-09d92a0bc6e4","Type":"ContainerStarted","Data":"26fd9672dac2bf028c35d7603f55056a99cc9641cf8ed2c083086c1fdc1a32ce"} Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.454867 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5486cbb4-g66c2" event={"ID":"7f878004-f437-4db3-a695-09d92a0bc6e4","Type":"ContainerStarted","Data":"eb7032fd148a3bf51ec65137ddc16b40bde51fa3d0136f61d52ca0ffeef5ba43"} Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.458859 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerStarted","Data":"63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3"} Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.459226 4773 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-6b7789fcb9-fpqth" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon-log" containerID="cri-o://ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033" gracePeriod=30 Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.459416 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b7789fcb9-fpqth" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon" containerID="cri-o://63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3" gracePeriod=30 Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.494528 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bdbfb487c-6x9rh" podStartSLOduration=3.459277955 podStartE2EDuration="16.494506142s" podCreationTimestamp="2025-10-12 21:18:49 +0000 UTC" firstStartedPulling="2025-10-12 21:18:51.120398769 +0000 UTC m=+3279.356697319" lastFinishedPulling="2025-10-12 21:19:04.155626946 +0000 UTC m=+3292.391925506" observedRunningTime="2025-10-12 21:19:05.47797697 +0000 UTC m=+3293.714275530" watchObservedRunningTime="2025-10-12 21:19:05.494506142 +0000 UTC m=+3293.730804712" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.510187 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f5486cbb4-g66c2" podStartSLOduration=2.7490328550000003 podStartE2EDuration="13.51016661s" podCreationTimestamp="2025-10-12 21:18:52 +0000 UTC" firstStartedPulling="2025-10-12 21:18:53.516810876 +0000 UTC m=+3281.753109436" lastFinishedPulling="2025-10-12 21:19:04.277944641 +0000 UTC m=+3292.514243191" observedRunningTime="2025-10-12 21:19:05.497386861 +0000 UTC m=+3293.733685441" watchObservedRunningTime="2025-10-12 21:19:05.51016661 +0000 UTC m=+3293.746465180" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.537761 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-748d9fcc4-b9gp4" podStartSLOduration=3.585847212 podStartE2EDuration="14.536702476s" podCreationTimestamp="2025-10-12 21:18:51 +0000 UTC" firstStartedPulling="2025-10-12 21:18:53.150764576 +0000 UTC m=+3281.387063136" lastFinishedPulling="2025-10-12 21:19:04.10161984 +0000 UTC m=+3292.337918400" observedRunningTime="2025-10-12 21:19:05.530008133 +0000 UTC m=+3293.766306693" watchObservedRunningTime="2025-10-12 21:19:05.536702476 +0000 UTC m=+3293.773001046" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.549741 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6b7789fcb9-fpqth" podStartSLOduration=2.6085382299999997 podStartE2EDuration="16.549709172s" podCreationTimestamp="2025-10-12 21:18:49 +0000 UTC" firstStartedPulling="2025-10-12 21:18:50.221526367 +0000 UTC m=+3278.457824917" lastFinishedPulling="2025-10-12 21:19:04.162697299 +0000 UTC m=+3292.398995859" observedRunningTime="2025-10-12 21:19:05.547521332 +0000 UTC m=+3293.783819892" watchObservedRunningTime="2025-10-12 21:19:05.549709172 +0000 UTC m=+3293.786007732" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.745251 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.745309 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.808385 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.824007 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:05 crc kubenswrapper[4773]: I1012 21:19:05.965938 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.091691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tblf\" (UniqueName: \"kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf\") pod \"30baf00d-ffd7-4717-86a5-a5d97f990a5f\" (UID: \"30baf00d-ffd7-4717-86a5-a5d97f990a5f\") " Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.100122 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf" (OuterVolumeSpecName: "kube-api-access-4tblf") pod "30baf00d-ffd7-4717-86a5-a5d97f990a5f" (UID: "30baf00d-ffd7-4717-86a5-a5d97f990a5f"). InnerVolumeSpecName "kube-api-access-4tblf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.194323 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tblf\" (UniqueName: \"kubernetes.io/projected/30baf00d-ffd7-4717-86a5-a5d97f990a5f-kube-api-access-4tblf\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.469897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-8dc8-account-create-vvvfj" event={"ID":"30baf00d-ffd7-4717-86a5-a5d97f990a5f","Type":"ContainerDied","Data":"468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f"} Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.469958 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468bb08ca8c06e972d5f83867512cea6c73350164cb18b6ce27213987450c38f" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.471077 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-8dc8-account-create-vvvfj" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.471118 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.471166 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.728368 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.729932 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.761294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 21:19:06 crc kubenswrapper[4773]: I1012 21:19:06.787688 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 21:19:07 crc kubenswrapper[4773]: I1012 21:19:07.476842 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 21:19:07 crc kubenswrapper[4773]: I1012 21:19:07.477699 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.510740 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-hvcc5"] Oct 12 21:19:09 crc kubenswrapper[4773]: E1012 21:19:09.511366 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30baf00d-ffd7-4717-86a5-a5d97f990a5f" containerName="mariadb-account-create" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.511379 4773 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="30baf00d-ffd7-4717-86a5-a5d97f990a5f" containerName="mariadb-account-create" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.511550 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30baf00d-ffd7-4717-86a5-a5d97f990a5f" containerName="mariadb-account-create" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.512122 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.513510 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hz8pk" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.513931 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.527769 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-hvcc5"] Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.666293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.666338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.666360 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.666481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnsq5\" (UniqueName: \"kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.681797 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.768300 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.768343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.768363 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.768437 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dnsq5\" (UniqueName: \"kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.781626 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.785540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.786216 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnsq5\" (UniqueName: \"kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.787177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle\") pod \"manila-db-sync-hvcc5\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:09 crc kubenswrapper[4773]: I1012 21:19:09.827588 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.259397 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.337945 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.338043 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.372506 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.456698 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-hvcc5"] Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.510687 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hvcc5" event={"ID":"5f8d0c7f-e847-428d-968b-10e78d9c3680","Type":"ContainerStarted","Data":"33b2a6e7f2f094724b5813f7eded20a421b0ac905d2e3ef3c5dc02cce64e1edb"} Oct 12 21:19:10 crc kubenswrapper[4773]: I1012 21:19:10.567736 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 12 21:19:12 crc kubenswrapper[4773]: I1012 21:19:12.207551 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:19:12 crc kubenswrapper[4773]: I1012 21:19:12.208817 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:19:12 crc kubenswrapper[4773]: I1012 21:19:12.631548 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:19:12 crc kubenswrapper[4773]: I1012 21:19:12.631591 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:19:13 crc kubenswrapper[4773]: I1012 21:19:13.464763 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 21:19:18 crc kubenswrapper[4773]: I1012 21:19:18.605364 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hvcc5" event={"ID":"5f8d0c7f-e847-428d-968b-10e78d9c3680","Type":"ContainerStarted","Data":"e026200ac82a1075137ab9d7f523d33a1bd92905ef0743494a6f69a239c36ce0"} Oct 12 21:19:22 crc kubenswrapper[4773]: I1012 21:19:22.209241 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 12 21:19:22 crc kubenswrapper[4773]: I1012 21:19:22.638302 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5486cbb4-g66c2" podUID="7f878004-f437-4db3-a695-09d92a0bc6e4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 12 21:19:26 crc kubenswrapper[4773]: I1012 21:19:26.686945 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f8d0c7f-e847-428d-968b-10e78d9c3680" containerID="e026200ac82a1075137ab9d7f523d33a1bd92905ef0743494a6f69a239c36ce0" exitCode=0 Oct 12 21:19:26 crc kubenswrapper[4773]: I1012 21:19:26.686998 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hvcc5" event={"ID":"5f8d0c7f-e847-428d-968b-10e78d9c3680","Type":"ContainerDied","Data":"e026200ac82a1075137ab9d7f523d33a1bd92905ef0743494a6f69a239c36ce0"} Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.176388 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.280340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnsq5\" (UniqueName: \"kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5\") pod \"5f8d0c7f-e847-428d-968b-10e78d9c3680\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.280550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle\") pod \"5f8d0c7f-e847-428d-968b-10e78d9c3680\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.280633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data\") pod \"5f8d0c7f-e847-428d-968b-10e78d9c3680\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.280674 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data\") pod \"5f8d0c7f-e847-428d-968b-10e78d9c3680\" (UID: \"5f8d0c7f-e847-428d-968b-10e78d9c3680\") " Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.289957 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5" (OuterVolumeSpecName: "kube-api-access-dnsq5") pod "5f8d0c7f-e847-428d-968b-10e78d9c3680" (UID: "5f8d0c7f-e847-428d-968b-10e78d9c3680"). InnerVolumeSpecName "kube-api-access-dnsq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.298906 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "5f8d0c7f-e847-428d-968b-10e78d9c3680" (UID: "5f8d0c7f-e847-428d-968b-10e78d9c3680"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.313948 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data" (OuterVolumeSpecName: "config-data") pod "5f8d0c7f-e847-428d-968b-10e78d9c3680" (UID: "5f8d0c7f-e847-428d-968b-10e78d9c3680"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.315611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f8d0c7f-e847-428d-968b-10e78d9c3680" (UID: "5f8d0c7f-e847-428d-968b-10e78d9c3680"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.382297 4773 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.382523 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.382598 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnsq5\" (UniqueName: \"kubernetes.io/projected/5f8d0c7f-e847-428d-968b-10e78d9c3680-kube-api-access-dnsq5\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.382653 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8d0c7f-e847-428d-968b-10e78d9c3680-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.704952 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hvcc5" event={"ID":"5f8d0c7f-e847-428d-968b-10e78d9c3680","Type":"ContainerDied","Data":"33b2a6e7f2f094724b5813f7eded20a421b0ac905d2e3ef3c5dc02cce64e1edb"} Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.704990 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b2a6e7f2f094724b5813f7eded20a421b0ac905d2e3ef3c5dc02cce64e1edb" Oct 12 21:19:28 crc kubenswrapper[4773]: I1012 21:19:28.705047 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-hvcc5" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.132685 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: E1012 21:19:29.133125 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8d0c7f-e847-428d-968b-10e78d9c3680" containerName="manila-db-sync" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.133142 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8d0c7f-e847-428d-968b-10e78d9c3680" containerName="manila-db-sync" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.133367 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8d0c7f-e847-428d-968b-10e78d9c3680" containerName="manila-db-sync" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.134604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.144967 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.144996 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.145216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hz8pk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.145388 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.179662 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7887c4559f-fs5qk"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.182919 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.247977 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.300084 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7887c4559f-fs5qk"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.310801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.310983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knl7n\" (UniqueName: \"kubernetes.io/projected/6875c763-6837-4c47-8738-b66b6d4e6306-kube-api-access-knl7n\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-config\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-openstack-edpm-ipam\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " 
pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311331 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-nb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311402 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311467 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-dns-svc\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311502 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " 
pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9547\" (UniqueName: \"kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-sb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.311662 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.317804 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.319852 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.322097 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.358164 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.380417 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.382943 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.384791 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.397159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knl7n\" (UniqueName: \"kubernetes.io/projected/6875c763-6837-4c47-8738-b66b6d4e6306-kube-api-access-knl7n\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412897 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-config\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412934 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412954 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412971 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.412986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413001 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsqn\" (UniqueName: 
\"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413019 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-openstack-edpm-ipam\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-nb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413108 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-dns-svc\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413175 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413199 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413216 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9547\" (UniqueName: \"kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547\") pod \"manila-scheduler-0\" (UID: 
\"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-sb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.413270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.415644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-openstack-edpm-ipam\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.416269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-dns-svc\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.416371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 
21:19:29.416956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-nb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.430242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-config\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.430799 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6875c763-6837-4c47-8738-b66b6d4e6306-ovsdbserver-sb\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.436238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.438109 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.441980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.452844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9547\" (UniqueName: \"kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.457207 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knl7n\" (UniqueName: \"kubernetes.io/projected/6875c763-6837-4c47-8738-b66b6d4e6306-kube-api-access-knl7n\") pod \"dnsmasq-dns-7887c4559f-fs5qk\" (UID: \"6875c763-6837-4c47-8738-b66b6d4e6306\") " pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.462418 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data\") pod \"manila-scheduler-0\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.474559 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.512123 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.516684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.516845 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.516922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.516987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhsls\" (UniqueName: \"kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517144 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517276 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsqn\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517409 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517480 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.517812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.527519 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " 
pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.528268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.528881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.531929 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.532102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.539133 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.539202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.560056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsqn\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn\") pod \"manila-share-share1-0\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhsls\" (UniqueName: \"kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " 
pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619643 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619708 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.619746 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.623050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.623936 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.626065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.628271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.638173 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.638506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.648684 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.652753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhsls\" (UniqueName: \"kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls\") pod \"manila-api-0\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " pod="openstack/manila-api-0" Oct 12 21:19:29 crc kubenswrapper[4773]: I1012 21:19:29.711969 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.135534 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7887c4559f-fs5qk"] Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.385028 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.479493 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.730950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerStarted","Data":"d6c368a62a5a15c7b9ba2e42158042e3f829e9d216025d1a2d8c61efc287492b"} Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.732530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerStarted","Data":"8d7c5b1d943fc78a6ecaa422ed7e3a672397c803b70ee441060b14cedbb46c29"} Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.735239 4773 generic.go:334] "Generic (PLEG): container finished" podID="6875c763-6837-4c47-8738-b66b6d4e6306" containerID="884a4741fa288e56f3ab876e42e55b368e654cda13daddfb5d3ce212d8281801" exitCode=0 Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.735281 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" event={"ID":"6875c763-6837-4c47-8738-b66b6d4e6306","Type":"ContainerDied","Data":"884a4741fa288e56f3ab876e42e55b368e654cda13daddfb5d3ce212d8281801"} Oct 12 21:19:30 crc kubenswrapper[4773]: I1012 21:19:30.735308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" 
event={"ID":"6875c763-6837-4c47-8738-b66b6d4e6306","Type":"ContainerStarted","Data":"8ac178441b905152b3f2258d07bf628606f5f1958021ed214c72267024b977a6"} Oct 12 21:19:31 crc kubenswrapper[4773]: I1012 21:19:31.109168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:31 crc kubenswrapper[4773]: I1012 21:19:31.751291 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerStarted","Data":"4c792c68c57cc00551989ec93069e16b8ebd3cb2e5b55939136e11d1c3b68cc2"} Oct 12 21:19:31 crc kubenswrapper[4773]: I1012 21:19:31.759218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" event={"ID":"6875c763-6837-4c47-8738-b66b6d4e6306","Type":"ContainerStarted","Data":"6ace9815f14d473547af6d45480c27b333a1f252392f20c733cd4722106dc5cb"} Oct 12 21:19:31 crc kubenswrapper[4773]: I1012 21:19:31.759432 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:31 crc kubenswrapper[4773]: I1012 21:19:31.829145 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" podStartSLOduration=2.829126133 podStartE2EDuration="2.829126133s" podCreationTimestamp="2025-10-12 21:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:19:31.812174796 +0000 UTC m=+3320.048473356" watchObservedRunningTime="2025-10-12 21:19:31.829126133 +0000 UTC m=+3320.065424693" Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.401845 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.777572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerStarted","Data":"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa"} Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.777902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerStarted","Data":"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168"} Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.777916 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api-log" containerID="cri-o://a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" gracePeriod=30 Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.778055 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api" containerID="cri-o://80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" gracePeriod=30 Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.778220 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.785876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerStarted","Data":"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd"} Oct 12 21:19:32 crc kubenswrapper[4773]: I1012 21:19:32.800163 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.800145471 podStartE2EDuration="3.800145471s" podCreationTimestamp="2025-10-12 21:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-12 21:19:32.793054396 +0000 UTC m=+3321.029352946" watchObservedRunningTime="2025-10-12 21:19:32.800145471 +0000 UTC m=+3321.036444031" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.560183 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.676409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.676909 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.677039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhsls\" (UniqueName: \"kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.677167 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.677280 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.677356 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.677437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs\") pod \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\" (UID: \"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d\") " Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.678386 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs" (OuterVolumeSpecName: "logs") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.678810 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.684023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.686748 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls" (OuterVolumeSpecName: "kube-api-access-zhsls") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "kube-api-access-zhsls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.697922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts" (OuterVolumeSpecName: "scripts") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.758432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data" (OuterVolumeSpecName: "config-data") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.770945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" (UID: "1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794343 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794384 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794396 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhsls\" (UniqueName: \"kubernetes.io/projected/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-kube-api-access-zhsls\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794406 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794415 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794539 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.794551 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815767 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerID="80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" exitCode=143 Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815799 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerID="a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" exitCode=143 Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerDied","Data":"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa"} Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerDied","Data":"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168"} Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815871 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d","Type":"ContainerDied","Data":"4c792c68c57cc00551989ec93069e16b8ebd3cb2e5b55939136e11d1c3b68cc2"} Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.815886 4773 scope.go:117] "RemoveContainer" containerID="80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 
21:19:33.816048 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.841302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerStarted","Data":"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99"} Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.873851 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.895428 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.915448 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:33 crc kubenswrapper[4773]: E1012 21:19:33.916077 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api-log" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.916141 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api-log" Oct 12 21:19:33 crc kubenswrapper[4773]: E1012 21:19:33.916216 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.916287 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.916500 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api-log" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.916571 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" containerName="manila-api" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.917627 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.923240 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.935390 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.824659056 podStartE2EDuration="4.935368167s" podCreationTimestamp="2025-10-12 21:19:29 +0000 UTC" firstStartedPulling="2025-10-12 21:19:30.378839721 +0000 UTC m=+3318.615138281" lastFinishedPulling="2025-10-12 21:19:31.489548832 +0000 UTC m=+3319.725847392" observedRunningTime="2025-10-12 21:19:33.902300475 +0000 UTC m=+3322.138599035" watchObservedRunningTime="2025-10-12 21:19:33.935368167 +0000 UTC m=+3322.171666727" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.967544 4773 scope.go:117] "RemoveContainer" containerID="a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.971269 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.971381 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 12 21:19:33 crc kubenswrapper[4773]: I1012 21:19:33.972309 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.004815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06283d24-b053-4893-9dab-4bfe5daf18b1-etc-machine-id\") pod \"manila-api-0\" (UID: 
\"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.004879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data-custom\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.004943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-scripts\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.004967 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.005003 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06283d24-b053-4893-9dab-4bfe5daf18b1-logs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.005030 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-internal-tls-certs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.005056 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-public-tls-certs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.005124 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8c6\" (UniqueName: \"kubernetes.io/projected/06283d24-b053-4893-9dab-4bfe5daf18b1-kube-api-access-4r8c6\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.005146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.045761 4773 scope.go:117] "RemoveContainer" containerID="80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" Oct 12 21:19:34 crc kubenswrapper[4773]: E1012 21:19:34.052253 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa\": container with ID starting with 80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa not found: ID does not exist" containerID="80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.052301 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa"} err="failed to get container status 
\"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa\": rpc error: code = NotFound desc = could not find container \"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa\": container with ID starting with 80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa not found: ID does not exist" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.052327 4773 scope.go:117] "RemoveContainer" containerID="a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" Oct 12 21:19:34 crc kubenswrapper[4773]: E1012 21:19:34.056787 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168\": container with ID starting with a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168 not found: ID does not exist" containerID="a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.056836 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168"} err="failed to get container status \"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168\": rpc error: code = NotFound desc = could not find container \"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168\": container with ID starting with a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168 not found: ID does not exist" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.056859 4773 scope.go:117] "RemoveContainer" containerID="80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.059590 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa"} err="failed to get 
container status \"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa\": rpc error: code = NotFound desc = could not find container \"80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa\": container with ID starting with 80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa not found: ID does not exist" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.059644 4773 scope.go:117] "RemoveContainer" containerID="a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.061023 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168"} err="failed to get container status \"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168\": rpc error: code = NotFound desc = could not find container \"a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168\": container with ID starting with a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168 not found: ID does not exist" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8c6\" (UniqueName: \"kubernetes.io/projected/06283d24-b053-4893-9dab-4bfe5daf18b1-kube-api-access-4r8c6\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106460 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06283d24-b053-4893-9dab-4bfe5daf18b1-etc-machine-id\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data-custom\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-scripts\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106617 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06283d24-b053-4893-9dab-4bfe5daf18b1-logs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.106652 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-internal-tls-certs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc 
kubenswrapper[4773]: I1012 21:19:34.106691 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-public-tls-certs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.107233 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06283d24-b053-4893-9dab-4bfe5daf18b1-etc-machine-id\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.108004 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06283d24-b053-4893-9dab-4bfe5daf18b1-logs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.112507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-scripts\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.114364 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-public-tls-certs\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.123664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-internal-tls-certs\") pod \"manila-api-0\" (UID: 
\"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.125358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.127786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.129285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8c6\" (UniqueName: \"kubernetes.io/projected/06283d24-b053-4893-9dab-4bfe5daf18b1-kube-api-access-4r8c6\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.129521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06283d24-b053-4893-9dab-4bfe5daf18b1-config-data-custom\") pod \"manila-api-0\" (UID: \"06283d24-b053-4893-9dab-4bfe5daf18b1\") " pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.284188 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 12 21:19:34 crc kubenswrapper[4773]: I1012 21:19:34.508073 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d" path="/var/lib/kubelet/pods/1ab3d4ad-1c5a-49e1-928d-c30dd4a8130d/volumes" Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.212778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 12 21:19:35 crc kubenswrapper[4773]: W1012 21:19:35.519295 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-4c792c68c57cc00551989ec93069e16b8ebd3cb2e5b55939136e11d1c3b68cc2": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-4c792c68c57cc00551989ec93069e16b8ebd3cb2e5b55939136e11d1c3b68cc2: no such file or directory Oct 12 21:19:35 crc kubenswrapper[4773]: W1012 21:19:35.519619 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-conmon-a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-conmon-a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168.scope: no such file or directory Oct 12 21:19:35 crc kubenswrapper[4773]: W1012 21:19:35.519634 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168.scope": 0x40000100 == IN_CREATE|IN_ISDIR): 
inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-a8c9a089f6563a38492371bb2236002b5c9dfe199d940de6889c0113d81a1168.scope: no such file or directory Oct 12 21:19:35 crc kubenswrapper[4773]: W1012 21:19:35.521828 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-conmon-80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-conmon-80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa.scope: no such file or directory Oct 12 21:19:35 crc kubenswrapper[4773]: W1012 21:19:35.521869 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice/crio-80325091138d5d035f1179e8788d7685c3121d240f6c2b6e5a1bff29b99fb8fa.scope: no such file or directory Oct 12 21:19:35 crc kubenswrapper[4773]: E1012 21:19:35.854552 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5354720_d7d2_4b8c_b4e6_ed0f8c546b69.slice/crio-conmon-8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5354720_d7d2_4b8c_b4e6_ed0f8c546b69.slice/crio-559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode623e8d3_599d_4637_97ae_b94024c87ad6.slice/crio-63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5354720_d7d2_4b8c_b4e6_ed0f8c546b69.slice/crio-8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode623e8d3_599d_4637_97ae_b94024c87ad6.slice/crio-conmon-ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode623e8d3_599d_4637_97ae_b94024c87ad6.slice/crio-ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab3d4ad_1c5a_49e1_928d_c30dd4a8130d.slice\": RecentStats: unable to find data in memory cache]" Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.915838 4773 generic.go:334] "Generic (PLEG): container finished" podID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerID="63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3" exitCode=137 Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.915868 4773 generic.go:334] "Generic (PLEG): container finished" podID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerID="ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033" exitCode=137 Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.915903 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerDied","Data":"63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3"} Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.915925 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerDied","Data":"ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033"} Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.928226 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerID="559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e" exitCode=137 Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.928252 4773 generic.go:334] "Generic (PLEG): container finished" podID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerID="8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef" exitCode=137 Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.928293 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerDied","Data":"559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e"} Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.928314 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerDied","Data":"8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef"} Oct 12 21:19:35 crc kubenswrapper[4773]: I1012 21:19:35.937962 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"06283d24-b053-4893-9dab-4bfe5daf18b1","Type":"ContainerStarted","Data":"a9761c6c424369a8ec71c143ae78df695d7ac5baa994d54d705a0d223b10af87"} Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 
21:19:36.098214 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.118601 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.177646 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts\") pod \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts\") pod \"e623e8d3-599d-4637-97ae-b94024c87ad6\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178174 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data\") pod \"e623e8d3-599d-4637-97ae-b94024c87ad6\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178200 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data\") pod \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178261 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs\") pod \"e623e8d3-599d-4637-97ae-b94024c87ad6\" (UID: 
\"e623e8d3-599d-4637-97ae-b94024c87ad6\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mr9c\" (UniqueName: \"kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c\") pod \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key\") pod \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178372 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key\") pod \"e623e8d3-599d-4637-97ae-b94024c87ad6\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178400 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs\") pod \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\" (UID: \"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.178427 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vms52\" (UniqueName: \"kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52\") pod \"e623e8d3-599d-4637-97ae-b94024c87ad6\" (UID: \"e623e8d3-599d-4637-97ae-b94024c87ad6\") " Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.183697 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" (UID: "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.183978 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs" (OuterVolumeSpecName: "logs") pod "e623e8d3-599d-4637-97ae-b94024c87ad6" (UID: "e623e8d3-599d-4637-97ae-b94024c87ad6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.186248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52" (OuterVolumeSpecName: "kube-api-access-vms52") pod "e623e8d3-599d-4637-97ae-b94024c87ad6" (UID: "e623e8d3-599d-4637-97ae-b94024c87ad6"). InnerVolumeSpecName "kube-api-access-vms52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.191913 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs" (OuterVolumeSpecName: "logs") pod "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" (UID: "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.199157 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c" (OuterVolumeSpecName: "kube-api-access-5mr9c") pod "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" (UID: "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69"). 
InnerVolumeSpecName "kube-api-access-5mr9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.208824 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e623e8d3-599d-4637-97ae-b94024c87ad6" (UID: "e623e8d3-599d-4637-97ae-b94024c87ad6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.244205 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data" (OuterVolumeSpecName: "config-data") pod "e623e8d3-599d-4637-97ae-b94024c87ad6" (UID: "e623e8d3-599d-4637-97ae-b94024c87ad6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.271537 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts" (OuterVolumeSpecName: "scripts") pod "e623e8d3-599d-4637-97ae-b94024c87ad6" (UID: "e623e8d3-599d-4637-97ae-b94024c87ad6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.273487 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data" (OuterVolumeSpecName: "config-data") pod "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" (UID: "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.279985 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts" (OuterVolumeSpecName: "scripts") pod "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" (UID: "b5354720-d7d2-4b8c-b4e6-ed0f8c546b69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281171 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281188 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281196 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e623e8d3-599d-4637-97ae-b94024c87ad6-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281205 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281213 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623e8d3-599d-4637-97ae-b94024c87ad6-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281222 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mr9c\" (UniqueName: 
\"kubernetes.io/projected/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-kube-api-access-5mr9c\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281238 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281247 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e623e8d3-599d-4637-97ae-b94024c87ad6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281255 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.281263 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vms52\" (UniqueName: \"kubernetes.io/projected/e623e8d3-599d-4637-97ae-b94024c87ad6-kube-api-access-vms52\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.953836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"06283d24-b053-4893-9dab-4bfe5daf18b1","Type":"ContainerStarted","Data":"996316001af358225d85a1a86816a4ab2c5fcbadfd367dccdc7ded44799ded19"} Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.954593 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.954616 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"06283d24-b053-4893-9dab-4bfe5daf18b1","Type":"ContainerStarted","Data":"8996ad28cb64563d628fdf92f567dc4c4546cf4da2817cae0c6cf4a253931764"} Oct 12 21:19:36 crc kubenswrapper[4773]: 
I1012 21:19:36.958821 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b7789fcb9-fpqth" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.958819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7789fcb9-fpqth" event={"ID":"e623e8d3-599d-4637-97ae-b94024c87ad6","Type":"ContainerDied","Data":"41055f4d2ff772e64364c67f79d9413eee394ae756a9a87babedcb5c72462197"} Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.958941 4773 scope.go:117] "RemoveContainer" containerID="63e0ca7c1959cf9ba215c65a73ffc2021de44d256f33088cab2de217210e29a3" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.964550 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bdbfb487c-6x9rh" event={"ID":"b5354720-d7d2-4b8c-b4e6-ed0f8c546b69","Type":"ContainerDied","Data":"b6cc93f7fcd2b9141a6d7aca4f5cf7a6d72a452273e4fc2187023bafdddb5690"} Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.964647 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bdbfb487c-6x9rh" Oct 12 21:19:36 crc kubenswrapper[4773]: I1012 21:19:36.972028 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.97200999 podStartE2EDuration="3.97200999s" podCreationTimestamp="2025-10-12 21:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:19:36.970801767 +0000 UTC m=+3325.207100327" watchObservedRunningTime="2025-10-12 21:19:36.97200999 +0000 UTC m=+3325.208308550" Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.003220 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.011811 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bdbfb487c-6x9rh"] Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.017706 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"] Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.025733 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b7789fcb9-fpqth"] Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.175625 4773 scope.go:117] "RemoveContainer" containerID="ae5f4d7ea8b1290e488b6ead5a6f9fc18c308635d507dadcc54504b8494aa033" Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.211883 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.304252 4773 scope.go:117] "RemoveContainer" 
containerID="559857e25eae55e91ecee30d806b4f7749071b9ea35eb75f2cf53cffdd2c0b0e" Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.493940 4773 scope.go:117] "RemoveContainer" containerID="8290dcb2347e60e610ed9f17629660c7c74e301c29a847d5173053ca2769acef" Oct 12 21:19:37 crc kubenswrapper[4773]: I1012 21:19:37.648083 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5486cbb4-g66c2" podUID="7f878004-f437-4db3-a695-09d92a0bc6e4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 21:19:38 crc kubenswrapper[4773]: I1012 21:19:38.496255 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" path="/var/lib/kubelet/pods/b5354720-d7d2-4b8c-b4e6-ed0f8c546b69/volumes" Oct 12 21:19:38 crc kubenswrapper[4773]: I1012 21:19:38.497225 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" path="/var/lib/kubelet/pods/e623e8d3-599d-4637-97ae-b94024c87ad6/volumes" Oct 12 21:19:39 crc kubenswrapper[4773]: I1012 21:19:39.474930 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 12 21:19:39 crc kubenswrapper[4773]: I1012 21:19:39.514587 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7887c4559f-fs5qk" Oct 12 21:19:39 crc kubenswrapper[4773]: I1012 21:19:39.591669 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 21:19:39 crc kubenswrapper[4773]: I1012 21:19:39.591902 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="dnsmasq-dns" 
containerID="cri-o://601481733d4fbe13c6bfb4d6f05339b018c18769d7ab14bfce96bc014b029e48" gracePeriod=10 Oct 12 21:19:40 crc kubenswrapper[4773]: I1012 21:19:40.017376 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerID="601481733d4fbe13c6bfb4d6f05339b018c18769d7ab14bfce96bc014b029e48" exitCode=0 Oct 12 21:19:40 crc kubenswrapper[4773]: I1012 21:19:40.017435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" event={"ID":"2ec59ef6-fa6e-457f-9042-f77dfa673dde","Type":"ContainerDied","Data":"601481733d4fbe13c6bfb4d6f05339b018c18769d7ab14bfce96bc014b029e48"} Oct 12 21:19:42 crc kubenswrapper[4773]: I1012 21:19:42.921266 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.045628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.045940 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.045974 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.046066 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.046133 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.046160 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfpn8\" (UniqueName: \"kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8\") pod \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\" (UID: \"2ec59ef6-fa6e-457f-9042-f77dfa673dde\") " Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.072669 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8" (OuterVolumeSpecName: "kube-api-access-zfpn8") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "kube-api-access-zfpn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.092895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" event={"ID":"2ec59ef6-fa6e-457f-9042-f77dfa673dde","Type":"ContainerDied","Data":"ee026c72e6f4121a855b1555adc920ccd0547441cdd039fb63ab8a2b125aa6f3"} Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.092948 4773 scope.go:117] "RemoveContainer" containerID="601481733d4fbe13c6bfb4d6f05339b018c18769d7ab14bfce96bc014b029e48" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.093073 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-867c8fd5c5-t9wtm" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.149999 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfpn8\" (UniqueName: \"kubernetes.io/projected/2ec59ef6-fa6e-457f-9042-f77dfa673dde-kube-api-access-zfpn8\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.152553 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.157127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.157158 4773 scope.go:117] "RemoveContainer" containerID="ac22d73968d727fd3afffa18c707f80741276767b4d9e94afe8b1fa53abe0858" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.163079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.167573 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.181122 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config" (OuterVolumeSpecName: "config") pod "2ec59ef6-fa6e-457f-9042-f77dfa673dde" (UID: "2ec59ef6-fa6e-457f-9042-f77dfa673dde"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.252197 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.252227 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-config\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.252235 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.252246 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.252254 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2ec59ef6-fa6e-457f-9042-f77dfa673dde-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.424382 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.441983 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-867c8fd5c5-t9wtm"] Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.454494 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.455106 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-central-agent" containerID="cri-o://6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7" gracePeriod=30 Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.455856 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-notification-agent" containerID="cri-o://56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a" gracePeriod=30 Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.455889 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="sg-core" containerID="cri-o://57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f" gracePeriod=30 Oct 12 21:19:43 crc kubenswrapper[4773]: I1012 21:19:43.456033 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="proxy-httpd" containerID="cri-o://bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2" gracePeriod=30 Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.107635 4773 generic.go:334] "Generic (PLEG): container finished" podID="37da9366-2055-43bd-83d0-cab5606dec64" containerID="bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2" exitCode=0 Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.108834 4773 generic.go:334] "Generic (PLEG): container finished" podID="37da9366-2055-43bd-83d0-cab5606dec64" containerID="57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f" exitCode=2 Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.108943 4773 generic.go:334] "Generic (PLEG): container finished" podID="37da9366-2055-43bd-83d0-cab5606dec64" 
containerID="6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7" exitCode=0 Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.108798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerDied","Data":"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2"} Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.109100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerDied","Data":"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f"} Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.109156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerDied","Data":"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7"} Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.111677 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerStarted","Data":"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d"} Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.111742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerStarted","Data":"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece"} Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.135966 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.152601059 podStartE2EDuration="15.135947134s" podCreationTimestamp="2025-10-12 21:19:29 +0000 UTC" firstStartedPulling="2025-10-12 21:19:30.507357974 +0000 UTC m=+3318.743656534" 
lastFinishedPulling="2025-10-12 21:19:42.490704049 +0000 UTC m=+3330.727002609" observedRunningTime="2025-10-12 21:19:44.130945856 +0000 UTC m=+3332.367244406" watchObservedRunningTime="2025-10-12 21:19:44.135947134 +0000 UTC m=+3332.372245694" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.522047 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" path="/var/lib/kubelet/pods/2ec59ef6-fa6e-457f-9042-f77dfa673dde/volumes" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.854340 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893630 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893733 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893753 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml\") pod 
\"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893903 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893955 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.893984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh87q\" (UniqueName: \"kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.894000 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle\") pod \"37da9366-2055-43bd-83d0-cab5606dec64\" (UID: \"37da9366-2055-43bd-83d0-cab5606dec64\") " Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.895041 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.897220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.902909 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q" (OuterVolumeSpecName: "kube-api-access-jh87q") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "kube-api-access-jh87q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.919569 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts" (OuterVolumeSpecName: "scripts") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.996712 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.996753 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh87q\" (UniqueName: \"kubernetes.io/projected/37da9366-2055-43bd-83d0-cab5606dec64-kube-api-access-jh87q\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.996764 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:44 crc kubenswrapper[4773]: I1012 21:19:44.996773 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37da9366-2055-43bd-83d0-cab5606dec64-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.048001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.053012 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.061960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.072559 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data" (OuterVolumeSpecName: "config-data") pod "37da9366-2055-43bd-83d0-cab5606dec64" (UID: "37da9366-2055-43bd-83d0-cab5606dec64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.098232 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.098264 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.098276 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.098284 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37da9366-2055-43bd-83d0-cab5606dec64-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.123091 4773 generic.go:334] "Generic (PLEG): container finished" podID="37da9366-2055-43bd-83d0-cab5606dec64" containerID="56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a" exitCode=0 Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.123188 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.123232 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerDied","Data":"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a"} Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.123260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37da9366-2055-43bd-83d0-cab5606dec64","Type":"ContainerDied","Data":"3e8b89f7e8ac34ff048eea1378a5c455138047a0c0b0a1a97b3cc66d98ba1017"} Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.123276 4773 scope.go:117] "RemoveContainer" containerID="bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.146102 4773 scope.go:117] "RemoveContainer" containerID="57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.163766 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.180480 4773 scope.go:117] "RemoveContainer" containerID="56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.182661 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.210203 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.210954 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="init" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.210973 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="init" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.210997 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="dnsmasq-dns" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211004 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="dnsmasq-dns" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211015 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211021 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211035 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="sg-core" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211040 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="sg-core" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211053 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-central-agent" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211059 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-central-agent" Oct 12 21:19:45 
crc kubenswrapper[4773]: E1012 21:19:45.211068 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211075 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211093 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211099 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211110 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211116 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211127 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="proxy-httpd" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211135 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="proxy-httpd" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.211147 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-notification-agent" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211152 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-notification-agent" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 
21:19:45.211312 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-notification-agent" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211326 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="ceilometer-central-agent" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211336 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211346 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211354 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec59ef6-fa6e-457f-9042-f77dfa673dde" containerName="dnsmasq-dns" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211366 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5354720-d7d2-4b8c-b4e6-ed0f8c546b69" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211373 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="sg-core" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211391 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e623e8d3-599d-4637-97ae-b94024c87ad6" containerName="horizon-log" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.211401 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="37da9366-2055-43bd-83d0-cab5606dec64" containerName="proxy-httpd" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.244259 4773 scope.go:117] "RemoveContainer" containerID="6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 
21:19:45.257121 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.259510 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.260748 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.262286 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.271438 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302372 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kgs\" (UniqueName: \"kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302484 4773 scope.go:117] 
"RemoveContainer" containerID="bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302827 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.302964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.303009 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: 
E1012 21:19:45.303317 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2\": container with ID starting with bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2 not found: ID does not exist" containerID="bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.303354 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2"} err="failed to get container status \"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2\": rpc error: code = NotFound desc = could not find container \"bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2\": container with ID starting with bee6ef8e4bb1a5698c14841d8a01401c0874a0f485d2d094c43df3e6fcd8eeb2 not found: ID does not exist" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.303379 4773 scope.go:117] "RemoveContainer" containerID="57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.303878 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f\": container with ID starting with 57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f not found: ID does not exist" containerID="57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.303948 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f"} err="failed to get container status \"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f\": 
rpc error: code = NotFound desc = could not find container \"57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f\": container with ID starting with 57e592b28a88ed190d3a8639ae331b4c0a85f7597c16b89c01e0f167eb52517f not found: ID does not exist" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.303977 4773 scope.go:117] "RemoveContainer" containerID="56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.304308 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a\": container with ID starting with 56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a not found: ID does not exist" containerID="56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.304338 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a"} err="failed to get container status \"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a\": rpc error: code = NotFound desc = could not find container \"56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a\": container with ID starting with 56a8512b3147fd3d1df3408aab780b0f02a9e8f1b833a5b665d712dfc35f880a not found: ID does not exist" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.304356 4773 scope.go:117] "RemoveContainer" containerID="6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7" Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.304597 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7\": container with ID starting with 
6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7 not found: ID does not exist" containerID="6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.304626 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7"} err="failed to get container status \"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7\": rpc error: code = NotFound desc = could not find container \"6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7\": container with ID starting with 6f9235d14d89cd25dd0f4248d8e17660e621502578cacd4b72bf8cc2464b29b7 not found: ID does not exist" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.338666 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:45 crc kubenswrapper[4773]: E1012 21:19:45.339505 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-44kgs log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404463 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kgs\" (UniqueName: \"kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404531 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.404580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.405767 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.405976 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.409490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.409905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.409910 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.410429 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.410644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.424820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kgs\" (UniqueName: \"kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs\") pod \"ceilometer-0\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " pod="openstack/ceilometer-0" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.608867 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:19:45 crc kubenswrapper[4773]: I1012 21:19:45.681843 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.147836 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.165360 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319435 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319480 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319523 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319620 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44kgs\" (UniqueName: \"kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319669 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.319797 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data\") pod \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\" (UID: \"e8c2f4d0-d630-4f87-81d5-ca5e434d50d3\") " Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.320291 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.320934 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.325514 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data" (OuterVolumeSpecName: "config-data") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.326020 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs" (OuterVolumeSpecName: "kube-api-access-44kgs") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "kube-api-access-44kgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.326015 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts" (OuterVolumeSpecName: "scripts") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.328908 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.328923 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.351268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" (UID: "e8c2f4d0-d630-4f87-81d5-ca5e434d50d3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422442 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422477 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422487 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422497 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44kgs\" (UniqueName: 
\"kubernetes.io/projected/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-kube-api-access-44kgs\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422507 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422515 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422525 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.422533 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:46 crc kubenswrapper[4773]: I1012 21:19:46.492023 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37da9366-2055-43bd-83d0-cab5606dec64" path="/var/lib/kubelet/pods/37da9366-2055-43bd-83d0-cab5606dec64/volumes" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.154506 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.192630 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.199941 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.241530 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.243740 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.251230 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.253883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.254014 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.254558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.337652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-config-data\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.337752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.337772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.337815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.337920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-scripts\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.338126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6ww\" (UniqueName: \"kubernetes.io/projected/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-kube-api-access-qt6ww\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.338165 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: 
I1012 21:19:47.338214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439428 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-scripts\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6ww\" (UniqueName: \"kubernetes.io/projected/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-kube-api-access-qt6ww\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439615 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-config-data\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-run-httpd\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.439989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-log-httpd\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.440258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-run-httpd\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.445346 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.447067 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-scripts\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.447185 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.448159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.452485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-config-data\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") " pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.458789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6ww\" (UniqueName: \"kubernetes.io/projected/9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d-kube-api-access-qt6ww\") pod \"ceilometer-0\" (UID: \"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d\") 
" pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.567673 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 21:19:47 crc kubenswrapper[4773]: I1012 21:19:47.842947 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.049141 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.125471 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f5486cbb4-g66c2" Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.167032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d","Type":"ContainerStarted","Data":"9be99bea0a3c3cb8d69f71ec12ab6fe9ffd59c7f7dc52e621597b9bdc1bb9c2b"} Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.183819 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.184063 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon-log" containerID="cri-o://592c0b7b3acf5884f2e1c8fc27ff80b95861279d02e3f59c8127b237c90c16a6" gracePeriod=30 Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.188309 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" containerID="cri-o://9139515c9bc911481c56d15cc53d5964d7bd8c93d7db9cb8e171178992bbc3d7" gracePeriod=30 Oct 12 21:19:48 crc kubenswrapper[4773]: I1012 21:19:48.493275 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e8c2f4d0-d630-4f87-81d5-ca5e434d50d3" path="/var/lib/kubelet/pods/e8c2f4d0-d630-4f87-81d5-ca5e434d50d3/volumes" Oct 12 21:19:49 crc kubenswrapper[4773]: I1012 21:19:49.177516 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d","Type":"ContainerStarted","Data":"ed95068dec155a5f77507989f587bdb799def07d1a2a66c741453c3f82bfe43a"} Oct 12 21:19:49 crc kubenswrapper[4773]: I1012 21:19:49.649945 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 12 21:19:50 crc kubenswrapper[4773]: I1012 21:19:50.187926 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d","Type":"ContainerStarted","Data":"7961db71a9eb552227229f76451acdef4fbd4ebb14604356e51ff6985361fea5"} Oct 12 21:19:51 crc kubenswrapper[4773]: I1012 21:19:51.203498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d","Type":"ContainerStarted","Data":"d823d2e867cc9881b95e4f0699aa911d3c55fe272fad31b19e6d0d5151302d04"} Oct 12 21:19:51 crc kubenswrapper[4773]: I1012 21:19:51.774563 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 12 21:19:51 crc kubenswrapper[4773]: I1012 21:19:51.874218 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.208678 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.215496 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d","Type":"ContainerStarted","Data":"0a376349ff546df13ca619bd72ff5e6390e5607be8a4b22c625b8aa1ff88c3e3"} Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.215800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.217742 4773 generic.go:334] "Generic (PLEG): container finished" podID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerID="9139515c9bc911481c56d15cc53d5964d7bd8c93d7db9cb8e171178992bbc3d7" exitCode=0 Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.217824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerDied","Data":"9139515c9bc911481c56d15cc53d5964d7bd8c93d7db9cb8e171178992bbc3d7"} Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.218078 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="manila-scheduler" containerID="cri-o://1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd" gracePeriod=30 Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.218120 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="probe" containerID="cri-o://2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99" gracePeriod=30 Oct 12 21:19:52 crc kubenswrapper[4773]: I1012 21:19:52.239085 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.426131455 podStartE2EDuration="5.239065439s" podCreationTimestamp="2025-10-12 21:19:47 +0000 UTC" firstStartedPulling="2025-10-12 21:19:48.077941877 +0000 UTC m=+3336.314240427" 
lastFinishedPulling="2025-10-12 21:19:51.890875851 +0000 UTC m=+3340.127174411" observedRunningTime="2025-10-12 21:19:52.237743513 +0000 UTC m=+3340.474042073" watchObservedRunningTime="2025-10-12 21:19:52.239065439 +0000 UTC m=+3340.475363999" Oct 12 21:19:53 crc kubenswrapper[4773]: I1012 21:19:53.227620 4773 generic.go:334] "Generic (PLEG): container finished" podID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerID="2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99" exitCode=0 Oct 12 21:19:53 crc kubenswrapper[4773]: I1012 21:19:53.228951 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerDied","Data":"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99"} Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.684456 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.744490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.744827 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.744947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: 
\"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.745042 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9547\" (UniqueName: \"kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.745126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.745387 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.745861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data\") pod \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\" (UID: \"713bb2e1-4bf8-44bf-8a4a-65e0f0def418\") " Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.746399 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.769290 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts" (OuterVolumeSpecName: "scripts") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.778081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547" (OuterVolumeSpecName: "kube-api-access-x9547") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "kube-api-access-x9547". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.782877 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.843035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.848025 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.848051 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.848061 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9547\" (UniqueName: \"kubernetes.io/projected/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-kube-api-access-x9547\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.888982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.944753 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data" (OuterVolumeSpecName: "config-data") pod "713bb2e1-4bf8-44bf-8a4a-65e0f0def418" (UID: "713bb2e1-4bf8-44bf-8a4a-65e0f0def418"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.950157 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:55 crc kubenswrapper[4773]: I1012 21:19:55.950183 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713bb2e1-4bf8-44bf-8a4a-65e0f0def418-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.258663 4773 generic.go:334] "Generic (PLEG): container finished" podID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerID="1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd" exitCode=0 Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.258701 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerDied","Data":"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd"} Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.258738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"713bb2e1-4bf8-44bf-8a4a-65e0f0def418","Type":"ContainerDied","Data":"8d7c5b1d943fc78a6ecaa422ed7e3a672397c803b70ee441060b14cedbb46c29"} Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.258754 4773 scope.go:117] "RemoveContainer" containerID="2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.258880 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.295564 4773 scope.go:117] "RemoveContainer" containerID="1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.316532 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.330929 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.340786 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:56 crc kubenswrapper[4773]: E1012 21:19:56.341238 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="manila-scheduler" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.341251 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="manila-scheduler" Oct 12 21:19:56 crc kubenswrapper[4773]: E1012 21:19:56.341273 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="probe" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.341279 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="probe" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.341470 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="manila-scheduler" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.341501 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" containerName="probe" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.342517 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.346993 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.348955 4773 scope.go:117] "RemoveContainer" containerID="2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99" Oct 12 21:19:56 crc kubenswrapper[4773]: E1012 21:19:56.349334 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99\": container with ID starting with 2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99 not found: ID does not exist" containerID="2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.349357 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99"} err="failed to get container status \"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99\": rpc error: code = NotFound desc = could not find container \"2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99\": container with ID starting with 2ce71618b55339f121e4cbfaf880e31250aeedee38b240cc23bd8931a9ac6f99 not found: ID does not exist" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.349378 4773 scope.go:117] "RemoveContainer" containerID="1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd" Oct 12 21:19:56 crc kubenswrapper[4773]: E1012 21:19:56.349556 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd\": container with ID starting with 
1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd not found: ID does not exist" containerID="1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.349572 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd"} err="failed to get container status \"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd\": rpc error: code = NotFound desc = could not find container \"1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd\": container with ID starting with 1495e9eb49792848561b93eb479b60e49d6c79c5fbd7b7e46f1a2a39f766cdcd not found: ID does not exist" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355750 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfbdaab1-e327-4291-a585-829aa6b81f00-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355827 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87llv\" (UniqueName: \"kubernetes.io/projected/dfbdaab1-e327-4291-a585-829aa6b81f00-kube-api-access-87llv\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355857 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.355903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-scripts\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.369262 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.457779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfbdaab1-e327-4291-a585-829aa6b81f00-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.457822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 
crc kubenswrapper[4773]: I1012 21:19:56.457887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87llv\" (UniqueName: \"kubernetes.io/projected/dfbdaab1-e327-4291-a585-829aa6b81f00-kube-api-access-87llv\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.457920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.457947 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-scripts\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.458041 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.458331 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfbdaab1-e327-4291-a585-829aa6b81f00-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.463267 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.464838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.464875 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-scripts\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.468263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfbdaab1-e327-4291-a585-829aa6b81f00-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.477878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87llv\" (UniqueName: \"kubernetes.io/projected/dfbdaab1-e327-4291-a585-829aa6b81f00-kube-api-access-87llv\") pod \"manila-scheduler-0\" (UID: \"dfbdaab1-e327-4291-a585-829aa6b81f00\") " pod="openstack/manila-scheduler-0" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.496267 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713bb2e1-4bf8-44bf-8a4a-65e0f0def418" path="/var/lib/kubelet/pods/713bb2e1-4bf8-44bf-8a4a-65e0f0def418/volumes" Oct 12 21:19:56 crc kubenswrapper[4773]: I1012 21:19:56.706730 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 12 21:19:57 crc kubenswrapper[4773]: W1012 21:19:57.206629 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfbdaab1_e327_4291_a585_829aa6b81f00.slice/crio-ce41d4b21ce0bb882ead0419a362b7da4dece08a26e20efad9d7ede211c997c9 WatchSource:0}: Error finding container ce41d4b21ce0bb882ead0419a362b7da4dece08a26e20efad9d7ede211c997c9: Status 404 returned error can't find the container with id ce41d4b21ce0bb882ead0419a362b7da4dece08a26e20efad9d7ede211c997c9 Oct 12 21:19:57 crc kubenswrapper[4773]: I1012 21:19:57.207600 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 12 21:19:57 crc kubenswrapper[4773]: I1012 21:19:57.267653 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dfbdaab1-e327-4291-a585-829aa6b81f00","Type":"ContainerStarted","Data":"ce41d4b21ce0bb882ead0419a362b7da4dece08a26e20efad9d7ede211c997c9"} Oct 12 21:19:58 crc kubenswrapper[4773]: I1012 21:19:58.278954 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dfbdaab1-e327-4291-a585-829aa6b81f00","Type":"ContainerStarted","Data":"c0517f3c2f64fbf58c6f55faaf13a4459d85d0bac1a360a3de4bbc40314f3009"} Oct 12 21:19:58 crc kubenswrapper[4773]: I1012 21:19:58.279216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dfbdaab1-e327-4291-a585-829aa6b81f00","Type":"ContainerStarted","Data":"1df6884f5b3b651fc702378764ca1e755c93420b3576d523f1e7c0203a7dc61b"} Oct 12 21:19:58 crc kubenswrapper[4773]: I1012 21:19:58.340663 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.340642196 podStartE2EDuration="2.340642196s" podCreationTimestamp="2025-10-12 21:19:56 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:19:58.314004552 +0000 UTC m=+3346.550303112" watchObservedRunningTime="2025-10-12 21:19:58.340642196 +0000 UTC m=+3346.576940756" Oct 12 21:20:01 crc kubenswrapper[4773]: I1012 21:20:01.118840 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 12 21:20:01 crc kubenswrapper[4773]: I1012 21:20:01.189163 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:01 crc kubenswrapper[4773]: I1012 21:20:01.318007 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="manila-share" containerID="cri-o://4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" gracePeriod=30 Oct 12 21:20:01 crc kubenswrapper[4773]: I1012 21:20:01.318045 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="probe" containerID="cri-o://a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" gracePeriod=30 Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.208149 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.307500 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.345663 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.345689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerDied","Data":"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d"} Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.346848 4773 scope.go:117] "RemoveContainer" containerID="a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.345518 4773 generic.go:334] "Generic (PLEG): container finished" podID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerID="a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" exitCode=0 Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.347171 4773 generic.go:334] "Generic (PLEG): container finished" podID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerID="4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" exitCode=1 Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.347203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerDied","Data":"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece"} Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.347240 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"715c75a4-941f-47e6-9ef1-9c490c0afbbd","Type":"ContainerDied","Data":"d6c368a62a5a15c7b9ba2e42158042e3f829e9d216025d1a2d8c61efc287492b"} Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.375499 4773 scope.go:117] "RemoveContainer" containerID="4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.422110 4773 scope.go:117] "RemoveContainer" 
containerID="a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" Oct 12 21:20:02 crc kubenswrapper[4773]: E1012 21:20:02.422597 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d\": container with ID starting with a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d not found: ID does not exist" containerID="a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.422643 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d"} err="failed to get container status \"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d\": rpc error: code = NotFound desc = could not find container \"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d\": container with ID starting with a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d not found: ID does not exist" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.422666 4773 scope.go:117] "RemoveContainer" containerID="4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" Oct 12 21:20:02 crc kubenswrapper[4773]: E1012 21:20:02.423204 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece\": container with ID starting with 4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece not found: ID does not exist" containerID="4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.423267 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece"} err="failed to get container status \"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece\": rpc error: code = NotFound desc = could not find container \"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece\": container with ID starting with 4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece not found: ID does not exist" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.423425 4773 scope.go:117] "RemoveContainer" containerID="a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.426548 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d"} err="failed to get container status \"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d\": rpc error: code = NotFound desc = could not find container \"a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d\": container with ID starting with a76eed4c0cee04df6b1a7924f29d5d6bcce89ffbaff603979a289ff8b794ef0d not found: ID does not exist" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.426577 4773 scope.go:117] "RemoveContainer" containerID="4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.426820 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece"} err="failed to get container status \"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece\": rpc error: code = NotFound desc = could not find container \"4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece\": container with ID starting with 4a8a1558254dd6891d5333e22c0265068ce80acc988cc3de2ac3f8ce154f9ece not found: ID does not 
exist" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470528 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470685 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470781 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbsqn\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470799 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470906 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.470964 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.471050 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph\") pod \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\" (UID: \"715c75a4-941f-47e6-9ef1-9c490c0afbbd\") " Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.473950 4773 reconciler_common.go:293] 
"Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.473971 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/715c75a4-941f-47e6-9ef1-9c490c0afbbd-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.477270 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph" (OuterVolumeSpecName: "ceph") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.478920 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn" (OuterVolumeSpecName: "kube-api-access-vbsqn") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "kube-api-access-vbsqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.479035 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.492235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts" (OuterVolumeSpecName: "scripts") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.533431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.575955 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.575987 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-ceph\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.575999 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.576007 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc 
kubenswrapper[4773]: I1012 21:20:02.576015 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbsqn\" (UniqueName: \"kubernetes.io/projected/715c75a4-941f-47e6-9ef1-9c490c0afbbd-kube-api-access-vbsqn\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.581187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data" (OuterVolumeSpecName: "config-data") pod "715c75a4-941f-47e6-9ef1-9c490c0afbbd" (UID: "715c75a4-941f-47e6-9ef1-9c490c0afbbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.678226 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715c75a4-941f-47e6-9ef1-9c490c0afbbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.683035 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.696653 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.710664 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:02 crc kubenswrapper[4773]: E1012 21:20:02.711252 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="probe" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.711273 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="probe" Oct 12 21:20:02 crc kubenswrapper[4773]: E1012 21:20:02.711293 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="manila-share" Oct 
12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.711300 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="manila-share" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.711460 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="manila-share" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.711489 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" containerName="probe" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.712380 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.714983 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.728199 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780751 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780780 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-ceph\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780808 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkr5m\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-kube-api-access-mkr5m\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780971 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.780997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.781069 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-scripts\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883026 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883100 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-scripts\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883207 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-ceph\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883276 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkr5m\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-kube-api-access-mkr5m\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883323 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.883929 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5db237cf-3d2a-48e1-bf07-a92ae2d96139-var-lib-manila\") pod \"manila-share-share1-0\" (UID: 
\"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.888305 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.888788 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-scripts\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.890592 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.890747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5db237cf-3d2a-48e1-bf07-a92ae2d96139-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.895858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-ceph\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:02 crc kubenswrapper[4773]: I1012 21:20:02.903422 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mkr5m\" (UniqueName: \"kubernetes.io/projected/5db237cf-3d2a-48e1-bf07-a92ae2d96139-kube-api-access-mkr5m\") pod \"manila-share-share1-0\" (UID: \"5db237cf-3d2a-48e1-bf07-a92ae2d96139\") " pod="openstack/manila-share-share1-0" Oct 12 21:20:03 crc kubenswrapper[4773]: I1012 21:20:03.091372 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 12 21:20:03 crc kubenswrapper[4773]: I1012 21:20:03.675139 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 12 21:20:04 crc kubenswrapper[4773]: I1012 21:20:04.393210 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5db237cf-3d2a-48e1-bf07-a92ae2d96139","Type":"ContainerStarted","Data":"7978538ab39a5441a39e75001095a1ad6154e5755e474191bb306cbc38138f03"} Oct 12 21:20:04 crc kubenswrapper[4773]: I1012 21:20:04.393798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5db237cf-3d2a-48e1-bf07-a92ae2d96139","Type":"ContainerStarted","Data":"093e1c9813d99c97aba14060ea808f614aba285f1c2f4fa80b24c9558d34c74f"} Oct 12 21:20:04 crc kubenswrapper[4773]: I1012 21:20:04.497883 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715c75a4-941f-47e6-9ef1-9c490c0afbbd" path="/var/lib/kubelet/pods/715c75a4-941f-47e6-9ef1-9c490c0afbbd/volumes" Oct 12 21:20:05 crc kubenswrapper[4773]: I1012 21:20:05.411477 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5db237cf-3d2a-48e1-bf07-a92ae2d96139","Type":"ContainerStarted","Data":"0a54ff738cf23aa2dec7870ceee66338b193c596732d774d882849ebc0c4f39a"} Oct 12 21:20:05 crc kubenswrapper[4773]: I1012 21:20:05.440458 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.440442522 
podStartE2EDuration="3.440442522s" podCreationTimestamp="2025-10-12 21:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:20:05.427084263 +0000 UTC m=+3353.663382833" watchObservedRunningTime="2025-10-12 21:20:05.440442522 +0000 UTC m=+3353.676741082" Oct 12 21:20:06 crc kubenswrapper[4773]: I1012 21:20:06.707353 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 12 21:20:12 crc kubenswrapper[4773]: I1012 21:20:12.208082 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-748d9fcc4-b9gp4" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Oct 12 21:20:12 crc kubenswrapper[4773]: I1012 21:20:12.208612 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:20:13 crc kubenswrapper[4773]: I1012 21:20:13.091658 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 12 21:20:17 crc kubenswrapper[4773]: I1012 21:20:17.589702 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 21:20:18 crc kubenswrapper[4773]: I1012 21:20:18.144993 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 12 21:20:18 crc kubenswrapper[4773]: I1012 21:20:18.545689 4773 generic.go:334] "Generic (PLEG): container finished" podID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerID="592c0b7b3acf5884f2e1c8fc27ff80b95861279d02e3f59c8127b237c90c16a6" exitCode=137 Oct 12 21:20:18 crc kubenswrapper[4773]: I1012 21:20:18.545742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerDied","Data":"592c0b7b3acf5884f2e1c8fc27ff80b95861279d02e3f59c8127b237c90c16a6"} Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.079250 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.228650 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.228750 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.228831 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g7pf\" (UniqueName: \"kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.228935 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.228965 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.229002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.229084 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs\") pod \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\" (UID: \"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb\") " Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.229967 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs" (OuterVolumeSpecName: "logs") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.235643 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf" (OuterVolumeSpecName: "kube-api-access-5g7pf") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "kube-api-access-5g7pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.236255 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.266307 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts" (OuterVolumeSpecName: "scripts") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.266620 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data" (OuterVolumeSpecName: "config-data") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.269358 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.303638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" (UID: "fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331175 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331449 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g7pf\" (UniqueName: \"kubernetes.io/projected/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-kube-api-access-5g7pf\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331462 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331472 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331482 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331491 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-logs\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.331501 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.556244 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-748d9fcc4-b9gp4" event={"ID":"fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb","Type":"ContainerDied","Data":"56d0d31d9f111798d6fed36aa0b8ee8278251563cd1c9a1ade5d6fbf0b9db604"} Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.556296 4773 scope.go:117] "RemoveContainer" containerID="9139515c9bc911481c56d15cc53d5964d7bd8c93d7db9cb8e171178992bbc3d7" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.556532 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-748d9fcc4-b9gp4" Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.590556 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.597985 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-748d9fcc4-b9gp4"] Oct 12 21:20:19 crc kubenswrapper[4773]: I1012 21:20:19.727132 4773 scope.go:117] "RemoveContainer" containerID="592c0b7b3acf5884f2e1c8fc27ff80b95861279d02e3f59c8127b237c90c16a6" Oct 12 21:20:20 crc kubenswrapper[4773]: I1012 21:20:20.503982 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" path="/var/lib/kubelet/pods/fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb/volumes" Oct 12 21:20:24 crc kubenswrapper[4773]: I1012 21:20:24.541739 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 12 21:20:28 crc kubenswrapper[4773]: 
I1012 21:20:28.669643 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:20:28 crc kubenswrapper[4773]: I1012 21:20:28.670533 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:20:35 crc kubenswrapper[4773]: I1012 21:20:35.133529 4773 scope.go:117] "RemoveContainer" containerID="aa1e086830698327baacb3bcc8ba6cc870bad45db1e358f875b1e9cf8557983e" Oct 12 21:20:35 crc kubenswrapper[4773]: I1012 21:20:35.167184 4773 scope.go:117] "RemoveContainer" containerID="d4e762b832d3199b3de4c1783405eff4b7589cd7cedf16bfb4917df1381c8e26" Oct 12 21:20:58 crc kubenswrapper[4773]: I1012 21:20:58.669744 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:20:58 crc kubenswrapper[4773]: I1012 21:20:58.670192 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:21:28 crc kubenswrapper[4773]: I1012 21:21:28.670070 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:21:28 crc kubenswrapper[4773]: I1012 21:21:28.670611 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:21:28 crc kubenswrapper[4773]: I1012 21:21:28.670662 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:21:28 crc kubenswrapper[4773]: I1012 21:21:28.671689 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:21:28 crc kubenswrapper[4773]: I1012 21:21:28.671782 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd" gracePeriod=600 Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.229285 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 21:21:29 crc kubenswrapper[4773]: E1012 21:21:29.229951 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" Oct 12 21:21:29 crc 
kubenswrapper[4773]: I1012 21:21:29.229964 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" Oct 12 21:21:29 crc kubenswrapper[4773]: E1012 21:21:29.229992 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon-log" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.229999 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon-log" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.230188 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon-log" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.230202 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf116e1-a9e0-44e1-bd21-bcbfe2c365eb" containerName="horizon" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.230868 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.241015 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.241098 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.241406 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.241668 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q5v5g" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.243963 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.247863 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd" exitCode=0 Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.247910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd"} Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.247940 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5"} Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.247956 4773 scope.go:117] "RemoveContainer" 
containerID="c248e40c5d787f0b0d93b126bf352c3b6bedb3e6a081d45a568c3fe7e8cc6e4e" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.313327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.313395 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.313642 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.313859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.314048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp42w\" (UniqueName: \"kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " 
pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.314235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.314317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.314504 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.314550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416707 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416879 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416905 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416931 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " 
pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.416990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.417043 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.417178 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp42w\" (UniqueName: \"kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.418227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.418578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.419291 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.419391 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.419528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.425050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.425204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.442107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp42w\" (UniqueName: 
\"kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.443421 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.468296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " pod="openstack/tempest-tests-tempest" Oct 12 21:21:29 crc kubenswrapper[4773]: I1012 21:21:29.552629 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 21:21:30 crc kubenswrapper[4773]: I1012 21:21:30.074138 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 21:21:30 crc kubenswrapper[4773]: W1012 21:21:30.078289 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6f8e69_6323_4d99_bdd3_7bb8ca4e6345.slice/crio-e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e WatchSource:0}: Error finding container e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e: Status 404 returned error can't find the container with id e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e Oct 12 21:21:30 crc kubenswrapper[4773]: I1012 21:21:30.258961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345","Type":"ContainerStarted","Data":"e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e"} Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.284054 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.287318 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.296087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.409515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.409635 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5h5j\" (UniqueName: \"kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.409663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.511919 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.512023 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r5h5j\" (UniqueName: \"kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.512046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.512507 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.512744 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.543188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5h5j\" (UniqueName: \"kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j\") pod \"certified-operators-thm6v\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:32 crc kubenswrapper[4773]: I1012 21:21:32.622359 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:21:33 crc kubenswrapper[4773]: I1012 21:21:33.152074 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:21:33 crc kubenswrapper[4773]: I1012 21:21:33.345517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerStarted","Data":"0ebabbf099efbeb65d47981e2cace4dabefb1cff3bee1d0a142e21a211e14867"} Oct 12 21:21:34 crc kubenswrapper[4773]: I1012 21:21:34.355062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerDied","Data":"5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b"} Oct 12 21:21:34 crc kubenswrapper[4773]: I1012 21:21:34.354980 4773 generic.go:334] "Generic (PLEG): container finished" podID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerID="5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b" exitCode=0 Oct 12 21:21:35 crc kubenswrapper[4773]: I1012 21:21:35.359920 4773 scope.go:117] "RemoveContainer" containerID="d467c418e06b3883ff7c8ba1b9ddda046929ae0dc74b9c61a5ce726e5c08dcc9" Oct 12 21:21:36 crc kubenswrapper[4773]: I1012 21:21:36.878294 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:21:36 crc kubenswrapper[4773]: I1012 21:21:36.883333 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:36 crc kubenswrapper[4773]: I1012 21:21:36.889742 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.024354 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.024412 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffl7t\" (UniqueName: \"kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.024513 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.126221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.126261 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ffl7t\" (UniqueName: \"kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.126385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.126779 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.126845 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.173022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffl7t\" (UniqueName: \"kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t\") pod \"redhat-marketplace-4dngv\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:37 crc kubenswrapper[4773]: I1012 21:21:37.222024 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:21:47 crc kubenswrapper[4773]: I1012 21:21:47.472283 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:21:47 crc kubenswrapper[4773]: I1012 21:21:47.547897 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerStarted","Data":"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913"} Oct 12 21:21:54 crc kubenswrapper[4773]: I1012 21:21:54.617921 4773 generic.go:334] "Generic (PLEG): container finished" podID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerID="7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913" exitCode=0 Oct 12 21:21:54 crc kubenswrapper[4773]: I1012 21:21:54.618010 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerDied","Data":"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913"} Oct 12 21:22:26 crc kubenswrapper[4773]: E1012 21:22:26.040327 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 12 21:22:26 crc kubenswrapper[4773]: E1012 21:22:26.046197 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tp42w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 21:22:26 crc kubenswrapper[4773]: E1012 21:22:26.047802 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" Oct 12 21:22:26 crc kubenswrapper[4773]: I1012 21:22:26.919733 4773 generic.go:334] "Generic (PLEG): container finished" podID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerID="d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837" exitCode=0 Oct 12 21:22:26 crc kubenswrapper[4773]: I1012 21:22:26.919788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" 
event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerDied","Data":"d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837"} Oct 12 21:22:26 crc kubenswrapper[4773]: I1012 21:22:26.920015 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerStarted","Data":"821cb4fa8a54ff120fae101e62a49e0a0318c3d11e51adaef8eddad5fc4c569e"} Oct 12 21:22:26 crc kubenswrapper[4773]: I1012 21:22:26.925282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerStarted","Data":"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6"} Oct 12 21:22:26 crc kubenswrapper[4773]: E1012 21:22:26.927988 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" Oct 12 21:22:27 crc kubenswrapper[4773]: I1012 21:22:27.035894 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-thm6v" podStartSLOduration=5.365112757 podStartE2EDuration="55.035879709s" podCreationTimestamp="2025-10-12 21:21:32 +0000 UTC" firstStartedPulling="2025-10-12 21:21:36.714608523 +0000 UTC m=+3444.950907083" lastFinishedPulling="2025-10-12 21:22:26.385375475 +0000 UTC m=+3494.621674035" observedRunningTime="2025-10-12 21:22:27.033239386 +0000 UTC m=+3495.269537946" watchObservedRunningTime="2025-10-12 21:22:27.035879709 +0000 UTC m=+3495.272178269" Oct 12 21:22:27 crc kubenswrapper[4773]: I1012 21:22:27.938478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerStarted","Data":"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc"} Oct 12 21:22:28 crc kubenswrapper[4773]: I1012 21:22:28.963397 4773 generic.go:334] "Generic (PLEG): container finished" podID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerID="74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc" exitCode=0 Oct 12 21:22:28 crc kubenswrapper[4773]: I1012 21:22:28.963523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerDied","Data":"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc"} Oct 12 21:22:29 crc kubenswrapper[4773]: I1012 21:22:29.973604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerStarted","Data":"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a"} Oct 12 21:22:29 crc kubenswrapper[4773]: I1012 21:22:29.993397 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4dngv" podStartSLOduration=51.471038425 podStartE2EDuration="53.993355929s" podCreationTimestamp="2025-10-12 21:21:36 +0000 UTC" firstStartedPulling="2025-10-12 21:22:26.921594708 +0000 UTC m=+3495.157893268" lastFinishedPulling="2025-10-12 21:22:29.443911942 +0000 UTC m=+3497.680210772" observedRunningTime="2025-10-12 21:22:29.99336126 +0000 UTC m=+3498.229659860" watchObservedRunningTime="2025-10-12 21:22:29.993355929 +0000 UTC m=+3498.229654489" Oct 12 21:22:32 crc kubenswrapper[4773]: I1012 21:22:32.623424 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:32 crc kubenswrapper[4773]: I1012 21:22:32.623768 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:33 crc kubenswrapper[4773]: I1012 21:22:33.674927 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-thm6v" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="registry-server" probeResult="failure" output=< Oct 12 21:22:33 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:22:33 crc kubenswrapper[4773]: > Oct 12 21:22:37 crc kubenswrapper[4773]: I1012 21:22:37.222892 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:37 crc kubenswrapper[4773]: I1012 21:22:37.225271 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:37 crc kubenswrapper[4773]: I1012 21:22:37.309548 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:38 crc kubenswrapper[4773]: I1012 21:22:38.108534 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:38 crc kubenswrapper[4773]: I1012 21:22:38.195334 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:22:39 crc kubenswrapper[4773]: I1012 21:22:39.952412 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.057474 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4dngv" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="registry-server" 
containerID="cri-o://39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a" gracePeriod=2 Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.523431 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.681116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities\") pod \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.681154 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffl7t\" (UniqueName: \"kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t\") pod \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.681290 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content\") pod \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\" (UID: \"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9\") " Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.682010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities" (OuterVolumeSpecName: "utilities") pod "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" (UID: "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.686864 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t" (OuterVolumeSpecName: "kube-api-access-ffl7t") pod "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" (UID: "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9"). InnerVolumeSpecName "kube-api-access-ffl7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.696299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" (UID: "2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.783621 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.783654 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffl7t\" (UniqueName: \"kubernetes.io/projected/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-kube-api-access-ffl7t\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:40 crc kubenswrapper[4773]: I1012 21:22:40.783692 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.067821 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345","Type":"ContainerStarted","Data":"31097c067ff5a7b3a735df9f1ca4d9bbde1cbb93f572dcceebb5f2fd754ce2de"} Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.070927 4773 generic.go:334] "Generic (PLEG): container finished" podID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerID="39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a" exitCode=0 Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.070966 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerDied","Data":"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a"} Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.070991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dngv" event={"ID":"2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9","Type":"ContainerDied","Data":"821cb4fa8a54ff120fae101e62a49e0a0318c3d11e51adaef8eddad5fc4c569e"} Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.071008 4773 scope.go:117] "RemoveContainer" containerID="39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.071081 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dngv" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.096673 4773 scope.go:117] "RemoveContainer" containerID="74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.103821 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.235401864 podStartE2EDuration="1m13.10380503s" podCreationTimestamp="2025-10-12 21:21:28 +0000 UTC" firstStartedPulling="2025-10-12 21:21:30.080853745 +0000 UTC m=+3438.317152295" lastFinishedPulling="2025-10-12 21:22:39.949256901 +0000 UTC m=+3508.185555461" observedRunningTime="2025-10-12 21:22:41.088041775 +0000 UTC m=+3509.324340335" watchObservedRunningTime="2025-10-12 21:22:41.10380503 +0000 UTC m=+3509.340103580" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.123435 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.144445 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dngv"] Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.148919 4773 scope.go:117] "RemoveContainer" containerID="d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.189623 4773 scope.go:117] "RemoveContainer" containerID="39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a" Oct 12 21:22:41 crc kubenswrapper[4773]: E1012 21:22:41.191239 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a\": container with ID starting with 39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a not found: ID does not exist" 
containerID="39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.191295 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a"} err="failed to get container status \"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a\": rpc error: code = NotFound desc = could not find container \"39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a\": container with ID starting with 39099a671ce138625c8a32f267996040f4509d788e79bbc3f153eacaf5efb72a not found: ID does not exist" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.191330 4773 scope.go:117] "RemoveContainer" containerID="74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc" Oct 12 21:22:41 crc kubenswrapper[4773]: E1012 21:22:41.191685 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc\": container with ID starting with 74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc not found: ID does not exist" containerID="74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.191742 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc"} err="failed to get container status \"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc\": rpc error: code = NotFound desc = could not find container \"74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc\": container with ID starting with 74c7effd0951aeb39435004ca279d0cee2fb098286fefa493904f4c340fe30dc not found: ID does not exist" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.191774 4773 scope.go:117] 
"RemoveContainer" containerID="d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837" Oct 12 21:22:41 crc kubenswrapper[4773]: E1012 21:22:41.192255 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837\": container with ID starting with d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837 not found: ID does not exist" containerID="d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837" Oct 12 21:22:41 crc kubenswrapper[4773]: I1012 21:22:41.192300 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837"} err="failed to get container status \"d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837\": rpc error: code = NotFound desc = could not find container \"d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837\": container with ID starting with d336fbbe3b1132dc68eb828dda692b6476e535e3295a848b84aed0decc451837 not found: ID does not exist" Oct 12 21:22:42 crc kubenswrapper[4773]: I1012 21:22:42.500920 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" path="/var/lib/kubelet/pods/2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9/volumes" Oct 12 21:22:42 crc kubenswrapper[4773]: I1012 21:22:42.680777 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:42 crc kubenswrapper[4773]: I1012 21:22:42.728851 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:43 crc kubenswrapper[4773]: I1012 21:22:43.965425 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:22:44 crc 
kubenswrapper[4773]: I1012 21:22:44.108562 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-thm6v" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="registry-server" containerID="cri-o://5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6" gracePeriod=2 Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.578105 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.666609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content\") pod \"89cf4b22-0f73-4514-9c53-01e2b9978846\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.666711 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities\") pod \"89cf4b22-0f73-4514-9c53-01e2b9978846\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.666868 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5h5j\" (UniqueName: \"kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j\") pod \"89cf4b22-0f73-4514-9c53-01e2b9978846\" (UID: \"89cf4b22-0f73-4514-9c53-01e2b9978846\") " Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.667569 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities" (OuterVolumeSpecName: "utilities") pod "89cf4b22-0f73-4514-9c53-01e2b9978846" (UID: "89cf4b22-0f73-4514-9c53-01e2b9978846"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.680078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j" (OuterVolumeSpecName: "kube-api-access-r5h5j") pod "89cf4b22-0f73-4514-9c53-01e2b9978846" (UID: "89cf4b22-0f73-4514-9c53-01e2b9978846"). InnerVolumeSpecName "kube-api-access-r5h5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.709274 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89cf4b22-0f73-4514-9c53-01e2b9978846" (UID: "89cf4b22-0f73-4514-9c53-01e2b9978846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.769067 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.769102 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89cf4b22-0f73-4514-9c53-01e2b9978846-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:44 crc kubenswrapper[4773]: I1012 21:22:44.769111 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5h5j\" (UniqueName: \"kubernetes.io/projected/89cf4b22-0f73-4514-9c53-01e2b9978846-kube-api-access-r5h5j\") on node \"crc\" DevicePath \"\"" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.118225 4773 generic.go:334] "Generic (PLEG): container finished" podID="89cf4b22-0f73-4514-9c53-01e2b9978846" 
containerID="5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6" exitCode=0 Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.118283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerDied","Data":"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6"} Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.118371 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thm6v" event={"ID":"89cf4b22-0f73-4514-9c53-01e2b9978846","Type":"ContainerDied","Data":"0ebabbf099efbeb65d47981e2cace4dabefb1cff3bee1d0a142e21a211e14867"} Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.118401 4773 scope.go:117] "RemoveContainer" containerID="5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.119560 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thm6v" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.158584 4773 scope.go:117] "RemoveContainer" containerID="7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.162879 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.172859 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-thm6v"] Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.191364 4773 scope.go:117] "RemoveContainer" containerID="5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.234144 4773 scope.go:117] "RemoveContainer" containerID="5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6" Oct 12 21:22:45 crc kubenswrapper[4773]: E1012 21:22:45.234677 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6\": container with ID starting with 5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6 not found: ID does not exist" containerID="5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.234732 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6"} err="failed to get container status \"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6\": rpc error: code = NotFound desc = could not find container \"5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6\": container with ID starting with 5a5094ac98651de198fd905f9c4463420d5cebf8819c6c9859fea75020456fb6 not 
found: ID does not exist" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.234760 4773 scope.go:117] "RemoveContainer" containerID="7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913" Oct 12 21:22:45 crc kubenswrapper[4773]: E1012 21:22:45.235241 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913\": container with ID starting with 7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913 not found: ID does not exist" containerID="7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.235279 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913"} err="failed to get container status \"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913\": rpc error: code = NotFound desc = could not find container \"7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913\": container with ID starting with 7d16ff557485812807d5d2f8ec92515105e30cee20225853fc0a4f433e63f913 not found: ID does not exist" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.235305 4773 scope.go:117] "RemoveContainer" containerID="5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b" Oct 12 21:22:45 crc kubenswrapper[4773]: E1012 21:22:45.235742 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b\": container with ID starting with 5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b not found: ID does not exist" containerID="5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b" Oct 12 21:22:45 crc kubenswrapper[4773]: I1012 21:22:45.235794 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b"} err="failed to get container status \"5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b\": rpc error: code = NotFound desc = could not find container \"5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b\": container with ID starting with 5b66f0f7d7eb853bc2f06585a7f5e4ba9dc156ed54469666b39604bccb337f6b not found: ID does not exist" Oct 12 21:22:46 crc kubenswrapper[4773]: I1012 21:22:46.492231 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" path="/var/lib/kubelet/pods/89cf4b22-0f73-4514-9c53-01e2b9978846/volumes" Oct 12 21:23:58 crc kubenswrapper[4773]: I1012 21:23:58.669058 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:23:58 crc kubenswrapper[4773]: I1012 21:23:58.669706 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:24:28 crc kubenswrapper[4773]: I1012 21:24:28.669328 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:24:28 crc kubenswrapper[4773]: I1012 21:24:28.670011 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:24:58 crc kubenswrapper[4773]: I1012 21:24:58.669589 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:24:58 crc kubenswrapper[4773]: I1012 21:24:58.670255 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:24:58 crc kubenswrapper[4773]: I1012 21:24:58.670312 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:24:58 crc kubenswrapper[4773]: I1012 21:24:58.671362 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:24:58 crc kubenswrapper[4773]: I1012 21:24:58.671606 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" 
containerID="cri-o://4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" gracePeriod=600 Oct 12 21:24:58 crc kubenswrapper[4773]: E1012 21:24:58.806553 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:24:59 crc kubenswrapper[4773]: I1012 21:24:59.555427 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" exitCode=0 Oct 12 21:24:59 crc kubenswrapper[4773]: I1012 21:24:59.555493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5"} Oct 12 21:24:59 crc kubenswrapper[4773]: I1012 21:24:59.555535 4773 scope.go:117] "RemoveContainer" containerID="b08f176c669e3d17182f07bb2f773fddf9422828820154afb12dfceb24b5defd" Oct 12 21:24:59 crc kubenswrapper[4773]: I1012 21:24:59.558763 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:24:59 crc kubenswrapper[4773]: E1012 21:24:59.561327 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:25:13 crc kubenswrapper[4773]: I1012 21:25:13.481882 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:25:13 crc kubenswrapper[4773]: E1012 21:25:13.482790 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.150395 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151400 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="extract-utilities" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151415 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="extract-utilities" Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151432 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="extract-utilities" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151440 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="extract-utilities" Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151458 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151466 4773 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151501 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151508 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151523 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="extract-content" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151531 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="extract-content" Oct 12 21:25:23 crc kubenswrapper[4773]: E1012 21:25:23.151557 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="extract-content" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151565 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="extract-content" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151775 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cf4b22-0f73-4514-9c53-01e2b9978846" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.151806 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3c224b-0b2c-40ab-8fa8-04f48e5bc1d9" containerName="registry-server" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.154168 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.164748 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.336326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4wxm\" (UniqueName: \"kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.336512 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.336550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.437901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.437973 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.438016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4wxm\" (UniqueName: \"kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.438563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.438564 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.458992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4wxm\" (UniqueName: \"kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm\") pod \"redhat-operators-kgpwf\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:23 crc kubenswrapper[4773]: I1012 21:25:23.477072 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:24 crc kubenswrapper[4773]: I1012 21:25:24.007733 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:24 crc kubenswrapper[4773]: I1012 21:25:24.802275 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerID="8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7" exitCode=0 Oct 12 21:25:24 crc kubenswrapper[4773]: I1012 21:25:24.802454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerDied","Data":"8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7"} Oct 12 21:25:24 crc kubenswrapper[4773]: I1012 21:25:24.802614 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerStarted","Data":"86be39ccec132af843cc9f42b307cf08789ff89bd28c720fd627e31a769b1626"} Oct 12 21:25:24 crc kubenswrapper[4773]: I1012 21:25:24.807911 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:25:26 crc kubenswrapper[4773]: I1012 21:25:26.835391 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerStarted","Data":"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1"} Oct 12 21:25:28 crc kubenswrapper[4773]: I1012 21:25:28.480970 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:25:28 crc kubenswrapper[4773]: E1012 21:25:28.481464 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:25:29 crc kubenswrapper[4773]: I1012 21:25:29.870103 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerID="1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1" exitCode=0 Oct 12 21:25:29 crc kubenswrapper[4773]: I1012 21:25:29.870168 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerDied","Data":"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1"} Oct 12 21:25:30 crc kubenswrapper[4773]: I1012 21:25:30.882193 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerStarted","Data":"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd"} Oct 12 21:25:30 crc kubenswrapper[4773]: I1012 21:25:30.916036 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kgpwf" podStartSLOduration=2.430243544 podStartE2EDuration="7.916016255s" podCreationTimestamp="2025-10-12 21:25:23 +0000 UTC" firstStartedPulling="2025-10-12 21:25:24.805307982 +0000 UTC m=+3673.041606542" lastFinishedPulling="2025-10-12 21:25:30.291080693 +0000 UTC m=+3678.527379253" observedRunningTime="2025-10-12 21:25:30.90822821 +0000 UTC m=+3679.144526770" watchObservedRunningTime="2025-10-12 21:25:30.916016255 +0000 UTC m=+3679.152314815" Oct 12 21:25:33 crc kubenswrapper[4773]: I1012 21:25:33.477636 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:33 crc kubenswrapper[4773]: I1012 21:25:33.477979 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:34 crc kubenswrapper[4773]: I1012 21:25:34.524414 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgpwf" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" probeResult="failure" output=< Oct 12 21:25:34 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:25:34 crc kubenswrapper[4773]: > Oct 12 21:25:39 crc kubenswrapper[4773]: I1012 21:25:39.481130 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:25:39 crc kubenswrapper[4773]: E1012 21:25:39.481773 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:25:44 crc kubenswrapper[4773]: I1012 21:25:44.527990 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgpwf" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" probeResult="failure" output=< Oct 12 21:25:44 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:25:44 crc kubenswrapper[4773]: > Oct 12 21:25:52 crc kubenswrapper[4773]: I1012 21:25:52.488888 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:25:52 crc kubenswrapper[4773]: E1012 21:25:52.489708 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:25:53 crc kubenswrapper[4773]: I1012 21:25:53.538371 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:53 crc kubenswrapper[4773]: I1012 21:25:53.617379 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:54 crc kubenswrapper[4773]: I1012 21:25:54.346974 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.100316 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kgpwf" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" containerID="cri-o://e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd" gracePeriod=2 Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.619513 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.750703 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities\") pod \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.750811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content\") pod \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.750981 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4wxm\" (UniqueName: \"kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm\") pod \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\" (UID: \"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743\") " Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.751909 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities" (OuterVolumeSpecName: "utilities") pod "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" (UID: "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.757260 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm" (OuterVolumeSpecName: "kube-api-access-l4wxm") pod "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" (UID: "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743"). InnerVolumeSpecName "kube-api-access-l4wxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.826222 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" (UID: "4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.853362 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4wxm\" (UniqueName: \"kubernetes.io/projected/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-kube-api-access-l4wxm\") on node \"crc\" DevicePath \"\"" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.853593 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:25:55 crc kubenswrapper[4773]: I1012 21:25:55.853660 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.111340 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerID="e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd" exitCode=0 Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.111447 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgpwf" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.111501 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerDied","Data":"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd"} Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.112906 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgpwf" event={"ID":"4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743","Type":"ContainerDied","Data":"86be39ccec132af843cc9f42b307cf08789ff89bd28c720fd627e31a769b1626"} Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.112950 4773 scope.go:117] "RemoveContainer" containerID="e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.154753 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.160453 4773 scope.go:117] "RemoveContainer" containerID="1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.166169 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kgpwf"] Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.189981 4773 scope.go:117] "RemoveContainer" containerID="8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.251461 4773 scope.go:117] "RemoveContainer" containerID="e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd" Oct 12 21:25:56 crc kubenswrapper[4773]: E1012 21:25:56.257185 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd\": container with ID starting with e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd not found: ID does not exist" containerID="e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.257254 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd"} err="failed to get container status \"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd\": rpc error: code = NotFound desc = could not find container \"e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd\": container with ID starting with e64c61995639f7a0cabc0947cb203381a9535a069618d381d07a45173b93e4dd not found: ID does not exist" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.257284 4773 scope.go:117] "RemoveContainer" containerID="1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1" Oct 12 21:25:56 crc kubenswrapper[4773]: E1012 21:25:56.257579 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1\": container with ID starting with 1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1 not found: ID does not exist" containerID="1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.257602 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1"} err="failed to get container status \"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1\": rpc error: code = NotFound desc = could not find container \"1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1\": container with ID 
starting with 1cc30195353a2480132bde85a56b38016932e454c3bd6fb3f738d47d7bf6d8e1 not found: ID does not exist" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.257618 4773 scope.go:117] "RemoveContainer" containerID="8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7" Oct 12 21:25:56 crc kubenswrapper[4773]: E1012 21:25:56.259088 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7\": container with ID starting with 8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7 not found: ID does not exist" containerID="8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.259110 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7"} err="failed to get container status \"8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7\": rpc error: code = NotFound desc = could not find container \"8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7\": container with ID starting with 8fea2a9268245370c4598601574b7eb8ad5ce167d1b23baab0698842f747e9d7 not found: ID does not exist" Oct 12 21:25:56 crc kubenswrapper[4773]: I1012 21:25:56.493676 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" path="/var/lib/kubelet/pods/4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743/volumes" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.768055 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:25:58 crc kubenswrapper[4773]: E1012 21:25:58.768717 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" Oct 12 21:25:58 crc 
kubenswrapper[4773]: I1012 21:25:58.768729 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" Oct 12 21:25:58 crc kubenswrapper[4773]: E1012 21:25:58.768781 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="extract-utilities" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.768787 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="extract-utilities" Oct 12 21:25:58 crc kubenswrapper[4773]: E1012 21:25:58.768799 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="extract-content" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.768805 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="extract-content" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.769003 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a3b66f4-9a0e-4f6d-93f3-4b3a230b4743" containerName="registry-server" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.776834 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.794115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.906342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.906395 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4ks\" (UniqueName: \"kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:58 crc kubenswrapper[4773]: I1012 21:25:58.906485 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.008101 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.008245 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.008278 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4ks\" (UniqueName: \"kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.008624 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.008718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.026027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4ks\" (UniqueName: \"kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks\") pod \"community-operators-h5z5r\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.135987 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:25:59 crc kubenswrapper[4773]: I1012 21:25:59.683180 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:26:00 crc kubenswrapper[4773]: I1012 21:26:00.162388 4773 generic.go:334] "Generic (PLEG): container finished" podID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerID="a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d" exitCode=0 Oct 12 21:26:00 crc kubenswrapper[4773]: I1012 21:26:00.162858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerDied","Data":"a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d"} Oct 12 21:26:00 crc kubenswrapper[4773]: I1012 21:26:00.162900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerStarted","Data":"550176a044eacfe8b5429bfbf0466d49a591c47faeace692f7bff9c5f721ba88"} Oct 12 21:26:01 crc kubenswrapper[4773]: I1012 21:26:01.171615 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerStarted","Data":"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7"} Oct 12 21:26:02 crc kubenswrapper[4773]: I1012 21:26:02.182659 4773 generic.go:334] "Generic (PLEG): container finished" podID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerID="8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7" exitCode=0 Oct 12 21:26:02 crc kubenswrapper[4773]: I1012 21:26:02.182788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" 
event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerDied","Data":"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7"} Oct 12 21:26:03 crc kubenswrapper[4773]: I1012 21:26:03.196091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerStarted","Data":"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615"} Oct 12 21:26:03 crc kubenswrapper[4773]: I1012 21:26:03.226109 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5z5r" podStartSLOduration=2.668527437 podStartE2EDuration="5.226092911s" podCreationTimestamp="2025-10-12 21:25:58 +0000 UTC" firstStartedPulling="2025-10-12 21:26:00.165560937 +0000 UTC m=+3708.401859487" lastFinishedPulling="2025-10-12 21:26:02.723126401 +0000 UTC m=+3710.959424961" observedRunningTime="2025-10-12 21:26:03.222524703 +0000 UTC m=+3711.458823293" watchObservedRunningTime="2025-10-12 21:26:03.226092911 +0000 UTC m=+3711.462391471" Oct 12 21:26:03 crc kubenswrapper[4773]: I1012 21:26:03.481457 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:26:03 crc kubenswrapper[4773]: E1012 21:26:03.482297 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:26:09 crc kubenswrapper[4773]: I1012 21:26:09.137041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:09 crc 
kubenswrapper[4773]: I1012 21:26:09.137576 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:09 crc kubenswrapper[4773]: I1012 21:26:09.186546 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:09 crc kubenswrapper[4773]: I1012 21:26:09.305018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:09 crc kubenswrapper[4773]: I1012 21:26:09.441145 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.267875 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5z5r" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="registry-server" containerID="cri-o://738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615" gracePeriod=2 Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.804888 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.865683 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn4ks\" (UniqueName: \"kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks\") pod \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.865766 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content\") pod \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.865798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities\") pod \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\" (UID: \"47703ad3-6ca0-4d7f-897a-f064a6adfbf7\") " Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.866926 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities" (OuterVolumeSpecName: "utilities") pod "47703ad3-6ca0-4d7f-897a-f064a6adfbf7" (UID: "47703ad3-6ca0-4d7f-897a-f064a6adfbf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.875991 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks" (OuterVolumeSpecName: "kube-api-access-kn4ks") pod "47703ad3-6ca0-4d7f-897a-f064a6adfbf7" (UID: "47703ad3-6ca0-4d7f-897a-f064a6adfbf7"). InnerVolumeSpecName "kube-api-access-kn4ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.926747 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47703ad3-6ca0-4d7f-897a-f064a6adfbf7" (UID: "47703ad3-6ca0-4d7f-897a-f064a6adfbf7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.968550 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn4ks\" (UniqueName: \"kubernetes.io/projected/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-kube-api-access-kn4ks\") on node \"crc\" DevicePath \"\"" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.968582 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:26:11 crc kubenswrapper[4773]: I1012 21:26:11.968594 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47703ad3-6ca0-4d7f-897a-f064a6adfbf7-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.278164 4773 generic.go:334] "Generic (PLEG): container finished" podID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerID="738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615" exitCode=0 Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.278245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerDied","Data":"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615"} Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.278260 4773 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5z5r" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.279119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5z5r" event={"ID":"47703ad3-6ca0-4d7f-897a-f064a6adfbf7","Type":"ContainerDied","Data":"550176a044eacfe8b5429bfbf0466d49a591c47faeace692f7bff9c5f721ba88"} Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.279172 4773 scope.go:117] "RemoveContainer" containerID="738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.299053 4773 scope.go:117] "RemoveContainer" containerID="8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.330524 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.340784 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5z5r"] Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.342546 4773 scope.go:117] "RemoveContainer" containerID="a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.374586 4773 scope.go:117] "RemoveContainer" containerID="738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615" Oct 12 21:26:12 crc kubenswrapper[4773]: E1012 21:26:12.375284 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615\": container with ID starting with 738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615 not found: ID does not exist" containerID="738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.375341 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615"} err="failed to get container status \"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615\": rpc error: code = NotFound desc = could not find container \"738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615\": container with ID starting with 738bdd276624be8387e239bd3db951b64f96947bdb87384c3948e39666600615 not found: ID does not exist" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.375377 4773 scope.go:117] "RemoveContainer" containerID="8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7" Oct 12 21:26:12 crc kubenswrapper[4773]: E1012 21:26:12.375957 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7\": container with ID starting with 8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7 not found: ID does not exist" containerID="8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.375997 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7"} err="failed to get container status \"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7\": rpc error: code = NotFound desc = could not find container \"8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7\": container with ID starting with 8a85e0abf0d028f1635f6cedbe1da941da6a48bfa1ea7987a5f81ac0b09b01b7 not found: ID does not exist" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.376026 4773 scope.go:117] "RemoveContainer" containerID="a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d" Oct 12 21:26:12 crc kubenswrapper[4773]: E1012 
21:26:12.376323 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d\": container with ID starting with a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d not found: ID does not exist" containerID="a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.376349 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d"} err="failed to get container status \"a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d\": rpc error: code = NotFound desc = could not find container \"a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d\": container with ID starting with a0e9381a58a0a6037f6b96a929e1f5ad23f3ef698304d591411817b396cac60d not found: ID does not exist" Oct 12 21:26:12 crc kubenswrapper[4773]: I1012 21:26:12.493825 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" path="/var/lib/kubelet/pods/47703ad3-6ca0-4d7f-897a-f064a6adfbf7/volumes" Oct 12 21:26:18 crc kubenswrapper[4773]: I1012 21:26:18.481095 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:26:18 crc kubenswrapper[4773]: E1012 21:26:18.481777 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:26:31 crc kubenswrapper[4773]: I1012 21:26:31.481788 
4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:26:31 crc kubenswrapper[4773]: E1012 21:26:31.482755 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:26:42 crc kubenswrapper[4773]: I1012 21:26:42.489691 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:26:42 crc kubenswrapper[4773]: E1012 21:26:42.491202 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:26:56 crc kubenswrapper[4773]: I1012 21:26:56.482598 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:26:56 crc kubenswrapper[4773]: E1012 21:26:56.483810 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:27:10 crc kubenswrapper[4773]: I1012 
21:27:10.481206 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:27:10 crc kubenswrapper[4773]: E1012 21:27:10.482305 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:27:25 crc kubenswrapper[4773]: I1012 21:27:25.482155 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:27:25 crc kubenswrapper[4773]: E1012 21:27:25.484095 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:27:37 crc kubenswrapper[4773]: I1012 21:27:37.484578 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:27:37 crc kubenswrapper[4773]: E1012 21:27:37.485701 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:27:48 crc 
kubenswrapper[4773]: I1012 21:27:48.481290 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:27:48 crc kubenswrapper[4773]: E1012 21:27:48.482043 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:28:03 crc kubenswrapper[4773]: I1012 21:28:03.481949 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:28:03 crc kubenswrapper[4773]: E1012 21:28:03.482808 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:28:16 crc kubenswrapper[4773]: I1012 21:28:16.481471 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:28:16 crc kubenswrapper[4773]: E1012 21:28:16.482161 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 
12 21:28:30 crc kubenswrapper[4773]: I1012 21:28:30.485245 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:28:30 crc kubenswrapper[4773]: E1012 21:28:30.486473 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:28:42 crc kubenswrapper[4773]: I1012 21:28:42.486831 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:28:42 crc kubenswrapper[4773]: E1012 21:28:42.487864 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:28:54 crc kubenswrapper[4773]: I1012 21:28:54.090584 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-2bsmz"] Oct 12 21:28:54 crc kubenswrapper[4773]: I1012 21:28:54.101284 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-2bsmz"] Oct 12 21:28:54 crc kubenswrapper[4773]: I1012 21:28:54.520450 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c5eb22-77dc-4904-b52e-9c40d9d488e2" path="/var/lib/kubelet/pods/d3c5eb22-77dc-4904-b52e-9c40d9d488e2/volumes" Oct 12 21:28:56 crc kubenswrapper[4773]: I1012 21:28:56.481917 4773 scope.go:117] 
"RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:28:56 crc kubenswrapper[4773]: E1012 21:28:56.482441 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:29:06 crc kubenswrapper[4773]: I1012 21:29:06.036467 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-8dc8-account-create-vvvfj"] Oct 12 21:29:06 crc kubenswrapper[4773]: I1012 21:29:06.048236 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-8dc8-account-create-vvvfj"] Oct 12 21:29:06 crc kubenswrapper[4773]: I1012 21:29:06.499914 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30baf00d-ffd7-4717-86a5-a5d97f990a5f" path="/var/lib/kubelet/pods/30baf00d-ffd7-4717-86a5-a5d97f990a5f/volumes" Oct 12 21:29:08 crc kubenswrapper[4773]: I1012 21:29:08.481484 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:29:08 crc kubenswrapper[4773]: E1012 21:29:08.482074 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:29:21 crc kubenswrapper[4773]: I1012 21:29:21.481788 4773 scope.go:117] "RemoveContainer" 
containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:29:21 crc kubenswrapper[4773]: E1012 21:29:21.482554 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:29:28 crc kubenswrapper[4773]: I1012 21:29:28.036467 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-hvcc5"] Oct 12 21:29:28 crc kubenswrapper[4773]: I1012 21:29:28.046129 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-hvcc5"] Oct 12 21:29:28 crc kubenswrapper[4773]: I1012 21:29:28.491311 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8d0c7f-e847-428d-968b-10e78d9c3680" path="/var/lib/kubelet/pods/5f8d0c7f-e847-428d-968b-10e78d9c3680/volumes" Oct 12 21:29:32 crc kubenswrapper[4773]: I1012 21:29:32.493912 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:29:32 crc kubenswrapper[4773]: E1012 21:29:32.494786 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:29:45 crc kubenswrapper[4773]: I1012 21:29:45.481605 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 
21:29:45 crc kubenswrapper[4773]: E1012 21:29:45.482854 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:29:46 crc kubenswrapper[4773]: I1012 21:29:46.821343 4773 scope.go:117] "RemoveContainer" containerID="977740cf82be5cb8c53419b0a218650845537a46295c9766904c19142bccc317" Oct 12 21:29:46 crc kubenswrapper[4773]: I1012 21:29:46.853586 4773 scope.go:117] "RemoveContainer" containerID="e026200ac82a1075137ab9d7f523d33a1bd92905ef0743494a6f69a239c36ce0" Oct 12 21:29:46 crc kubenswrapper[4773]: I1012 21:29:46.917542 4773 scope.go:117] "RemoveContainer" containerID="7d4952587b2616c873aea4d8a063eef16df9a555a757847398533a5a453eeec5" Oct 12 21:29:57 crc kubenswrapper[4773]: I1012 21:29:57.481509 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:29:57 crc kubenswrapper[4773]: E1012 21:29:57.485206 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.167332 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd"] Oct 12 21:30:00 crc kubenswrapper[4773]: E1012 21:30:00.168228 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="registry-server" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.168248 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="registry-server" Oct 12 21:30:00 crc kubenswrapper[4773]: E1012 21:30:00.168273 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="extract-content" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.168279 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="extract-content" Oct 12 21:30:00 crc kubenswrapper[4773]: E1012 21:30:00.168305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="extract-utilities" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.168312 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="extract-utilities" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.168549 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="47703ad3-6ca0-4d7f-897a-f064a6adfbf7" containerName="registry-server" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.169287 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.171951 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.174756 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.177987 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd"] Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.291215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t75r7\" (UniqueName: \"kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.291294 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.291345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.392549 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t75r7\" (UniqueName: \"kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.392627 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.392690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.393605 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.403592 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.409782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t75r7\" (UniqueName: \"kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7\") pod \"collect-profiles-29338410-nxxwd\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:00 crc kubenswrapper[4773]: I1012 21:30:00.497286 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:01 crc kubenswrapper[4773]: I1012 21:30:01.152326 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd"] Oct 12 21:30:01 crc kubenswrapper[4773]: I1012 21:30:01.527630 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" event={"ID":"89e13c2e-e314-4e78-b59c-3cbd7359c1f5","Type":"ContainerStarted","Data":"90674b9688dc2805bf70e454b870451732931c60b2c990a7091385e63fe869d4"} Oct 12 21:30:01 crc kubenswrapper[4773]: I1012 21:30:01.528233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" event={"ID":"89e13c2e-e314-4e78-b59c-3cbd7359c1f5","Type":"ContainerStarted","Data":"ae7f47afc4a705ca5149d69b08a929ab38e62ba5081286da64bb01de94488b88"} Oct 12 21:30:01 crc kubenswrapper[4773]: I1012 21:30:01.546402 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" 
podStartSLOduration=1.546381719 podStartE2EDuration="1.546381719s" podCreationTimestamp="2025-10-12 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:30:01.544421255 +0000 UTC m=+3949.780719815" watchObservedRunningTime="2025-10-12 21:30:01.546381719 +0000 UTC m=+3949.782680279" Oct 12 21:30:02 crc kubenswrapper[4773]: I1012 21:30:02.537084 4773 generic.go:334] "Generic (PLEG): container finished" podID="89e13c2e-e314-4e78-b59c-3cbd7359c1f5" containerID="90674b9688dc2805bf70e454b870451732931c60b2c990a7091385e63fe869d4" exitCode=0 Oct 12 21:30:02 crc kubenswrapper[4773]: I1012 21:30:02.537207 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" event={"ID":"89e13c2e-e314-4e78-b59c-3cbd7359c1f5","Type":"ContainerDied","Data":"90674b9688dc2805bf70e454b870451732931c60b2c990a7091385e63fe869d4"} Oct 12 21:30:03 crc kubenswrapper[4773]: I1012 21:30:03.942792 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:03 crc kubenswrapper[4773]: I1012 21:30:03.975578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume\") pod \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " Oct 12 21:30:03 crc kubenswrapper[4773]: I1012 21:30:03.982829 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89e13c2e-e314-4e78-b59c-3cbd7359c1f5" (UID: "89e13c2e-e314-4e78-b59c-3cbd7359c1f5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.077416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume\") pod \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.077499 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t75r7\" (UniqueName: \"kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7\") pod \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\" (UID: \"89e13c2e-e314-4e78-b59c-3cbd7359c1f5\") " Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.077887 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.078078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "89e13c2e-e314-4e78-b59c-3cbd7359c1f5" (UID: "89e13c2e-e314-4e78-b59c-3cbd7359c1f5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.080589 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7" (OuterVolumeSpecName: "kube-api-access-t75r7") pod "89e13c2e-e314-4e78-b59c-3cbd7359c1f5" (UID: "89e13c2e-e314-4e78-b59c-3cbd7359c1f5"). InnerVolumeSpecName "kube-api-access-t75r7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.179459 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.179492 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t75r7\" (UniqueName: \"kubernetes.io/projected/89e13c2e-e314-4e78-b59c-3cbd7359c1f5-kube-api-access-t75r7\") on node \"crc\" DevicePath \"\"" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.215136 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p"] Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.223174 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338365-h522p"] Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.491913 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82148a00-62d2-4597-8574-8726b05e9082" path="/var/lib/kubelet/pods/82148a00-62d2-4597-8574-8726b05e9082/volumes" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.554674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" event={"ID":"89e13c2e-e314-4e78-b59c-3cbd7359c1f5","Type":"ContainerDied","Data":"ae7f47afc4a705ca5149d69b08a929ab38e62ba5081286da64bb01de94488b88"} Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.554708 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7f47afc4a705ca5149d69b08a929ab38e62ba5081286da64bb01de94488b88" Oct 12 21:30:04 crc kubenswrapper[4773]: I1012 21:30:04.555859 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338410-nxxwd" Oct 12 21:30:11 crc kubenswrapper[4773]: I1012 21:30:11.481990 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:30:12 crc kubenswrapper[4773]: I1012 21:30:12.633372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753"} Oct 12 21:30:47 crc kubenswrapper[4773]: I1012 21:30:47.027986 4773 scope.go:117] "RemoveContainer" containerID="56fd99cca96ff004c65d493eeaf6e47bdc6205ac0ad51708d842b078ec07e472" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.583823 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:11 crc kubenswrapper[4773]: E1012 21:32:11.584679 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e13c2e-e314-4e78-b59c-3cbd7359c1f5" containerName="collect-profiles" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.584692 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e13c2e-e314-4e78-b59c-3cbd7359c1f5" containerName="collect-profiles" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.584939 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e13c2e-e314-4e78-b59c-3cbd7359c1f5" containerName="collect-profiles" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.586160 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.611062 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.746643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.747847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhcv\" (UniqueName: \"kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.747981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.849799 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.850214 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.850445 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhcv\" (UniqueName: \"kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.850561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.851022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.867342 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhcv\" (UniqueName: \"kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv\") pod \"redhat-marketplace-xmjlv\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:11 crc kubenswrapper[4773]: I1012 21:32:11.905638 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:12 crc kubenswrapper[4773]: I1012 21:32:12.237161 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:12 crc kubenswrapper[4773]: I1012 21:32:12.748438 4773 generic.go:334] "Generic (PLEG): container finished" podID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerID="3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f" exitCode=0 Oct 12 21:32:12 crc kubenswrapper[4773]: I1012 21:32:12.748782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerDied","Data":"3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f"} Oct 12 21:32:12 crc kubenswrapper[4773]: I1012 21:32:12.748806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerStarted","Data":"d531d805a8e4a6fd78aa4b3aaf21a25f58d5d72c5b1fd3aaa545350ecc72b82f"} Oct 12 21:32:12 crc kubenswrapper[4773]: I1012 21:32:12.751169 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:32:13 crc kubenswrapper[4773]: I1012 21:32:13.760362 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerStarted","Data":"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440"} Oct 12 21:32:14 crc kubenswrapper[4773]: I1012 21:32:14.772587 4773 generic.go:334] "Generic (PLEG): container finished" podID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerID="47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440" exitCode=0 Oct 12 21:32:14 crc kubenswrapper[4773]: I1012 21:32:14.774286 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerDied","Data":"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440"} Oct 12 21:32:14 crc kubenswrapper[4773]: E1012 21:32:14.940101 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 21:32:15 crc kubenswrapper[4773]: I1012 21:32:15.790700 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerStarted","Data":"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b"} Oct 12 21:32:21 crc kubenswrapper[4773]: I1012 21:32:21.906790 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:21 crc kubenswrapper[4773]: I1012 21:32:21.907477 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:21 crc kubenswrapper[4773]: I1012 21:32:21.965093 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:21 crc kubenswrapper[4773]: I1012 21:32:21.999369 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xmjlv" podStartSLOduration=8.549889757999999 
podStartE2EDuration="10.999346009s" podCreationTimestamp="2025-10-12 21:32:11 +0000 UTC" firstStartedPulling="2025-10-12 21:32:12.750838641 +0000 UTC m=+4080.987137191" lastFinishedPulling="2025-10-12 21:32:15.200294882 +0000 UTC m=+4083.436593442" observedRunningTime="2025-10-12 21:32:15.814217058 +0000 UTC m=+4084.050515618" watchObservedRunningTime="2025-10-12 21:32:21.999346009 +0000 UTC m=+4090.235644579" Oct 12 21:32:22 crc kubenswrapper[4773]: I1012 21:32:22.919639 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:22 crc kubenswrapper[4773]: I1012 21:32:22.971912 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:24 crc kubenswrapper[4773]: I1012 21:32:24.888649 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xmjlv" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="registry-server" containerID="cri-o://4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b" gracePeriod=2 Oct 12 21:32:25 crc kubenswrapper[4773]: E1012 21:32:25.167579 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.540020 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.633093 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities\") pod \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.633444 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content\") pod \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.633501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhcv\" (UniqueName: \"kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv\") pod \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\" (UID: \"d939eab5-6e0a-470b-a00c-ff22b7a49b8b\") " Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.634916 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities" (OuterVolumeSpecName: "utilities") pod "d939eab5-6e0a-470b-a00c-ff22b7a49b8b" (UID: "d939eab5-6e0a-470b-a00c-ff22b7a49b8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.639574 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv" (OuterVolumeSpecName: "kube-api-access-jzhcv") pod "d939eab5-6e0a-470b-a00c-ff22b7a49b8b" (UID: "d939eab5-6e0a-470b-a00c-ff22b7a49b8b"). InnerVolumeSpecName "kube-api-access-jzhcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.651929 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d939eab5-6e0a-470b-a00c-ff22b7a49b8b" (UID: "d939eab5-6e0a-470b-a00c-ff22b7a49b8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.734890 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.735247 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzhcv\" (UniqueName: \"kubernetes.io/projected/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-kube-api-access-jzhcv\") on node \"crc\" DevicePath \"\"" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.735260 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d939eab5-6e0a-470b-a00c-ff22b7a49b8b-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.899620 4773 generic.go:334] "Generic (PLEG): container finished" podID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerID="4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b" exitCode=0 Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.899668 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerDied","Data":"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b"} Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.899700 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xmjlv" event={"ID":"d939eab5-6e0a-470b-a00c-ff22b7a49b8b","Type":"ContainerDied","Data":"d531d805a8e4a6fd78aa4b3aaf21a25f58d5d72c5b1fd3aaa545350ecc72b82f"} Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.899736 4773 scope.go:117] "RemoveContainer" containerID="4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.899862 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xmjlv" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.935247 4773 scope.go:117] "RemoveContainer" containerID="47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440" Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.942212 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.949732 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xmjlv"] Oct 12 21:32:25 crc kubenswrapper[4773]: I1012 21:32:25.963235 4773 scope.go:117] "RemoveContainer" containerID="3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.011223 4773 scope.go:117] "RemoveContainer" containerID="4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b" Oct 12 21:32:26 crc kubenswrapper[4773]: E1012 21:32:26.012046 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b\": container with ID starting with 4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b not found: ID does not exist" containerID="4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.012108 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b"} err="failed to get container status \"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b\": rpc error: code = NotFound desc = could not find container \"4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b\": container with ID starting with 4d39a47dba2cc6030c6a677c7927017a4bd1404f87d1b2e7d0ba9740c44e492b not found: ID does not exist" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.012145 4773 scope.go:117] "RemoveContainer" containerID="47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440" Oct 12 21:32:26 crc kubenswrapper[4773]: E1012 21:32:26.012532 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440\": container with ID starting with 47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440 not found: ID does not exist" containerID="47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.012564 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440"} err="failed to get container status \"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440\": rpc error: code = NotFound desc = could not find container \"47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440\": container with ID starting with 47cd2856bcacce636aca722c6bddf9941aa02ac31fbdf6a48e144b4c37869440 not found: ID does not exist" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.012584 4773 scope.go:117] "RemoveContainer" containerID="3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f" Oct 12 21:32:26 crc kubenswrapper[4773]: E1012 
21:32:26.013012 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f\": container with ID starting with 3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f not found: ID does not exist" containerID="3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.013047 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f"} err="failed to get container status \"3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f\": rpc error: code = NotFound desc = could not find container \"3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f\": container with ID starting with 3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f not found: ID does not exist" Oct 12 21:32:26 crc kubenswrapper[4773]: I1012 21:32:26.496523 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" path="/var/lib/kubelet/pods/d939eab5-6e0a-470b-a00c-ff22b7a49b8b/volumes" Oct 12 21:32:28 crc kubenswrapper[4773]: I1012 21:32:28.669247 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:32:28 crc kubenswrapper[4773]: I1012 21:32:28.669979 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 21:32:35 crc kubenswrapper[4773]: E1012 21:32:35.398699 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 21:32:45 crc kubenswrapper[4773]: E1012 21:32:45.629166 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 21:32:55 crc kubenswrapper[4773]: E1012 21:32:55.860474 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 
21:32:58 crc kubenswrapper[4773]: I1012 21:32:58.669178 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:32:58 crc kubenswrapper[4773]: I1012 21:32:58.669962 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:33:06 crc kubenswrapper[4773]: E1012 21:33:06.096951 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-conmon-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd939eab5_6e0a_470b_a00c_ff22b7a49b8b.slice/crio-3775b551424c5cbd6a88592ad3e138284d42bb1a32af50a80cf995036c848d8f.scope\": RecentStats: unable to find data in memory cache]" Oct 12 21:33:28 crc kubenswrapper[4773]: I1012 21:33:28.669894 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:33:28 crc kubenswrapper[4773]: I1012 21:33:28.670524 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:33:28 crc kubenswrapper[4773]: I1012 21:33:28.670574 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:33:28 crc kubenswrapper[4773]: I1012 21:33:28.671513 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:33:28 crc kubenswrapper[4773]: I1012 21:33:28.671572 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753" gracePeriod=600 Oct 12 21:33:29 crc kubenswrapper[4773]: I1012 21:33:29.548138 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753" exitCode=0 Oct 12 21:33:29 crc kubenswrapper[4773]: I1012 21:33:29.548211 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753"} Oct 12 21:33:29 crc kubenswrapper[4773]: I1012 21:33:29.548609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b"} Oct 12 21:33:29 crc kubenswrapper[4773]: I1012 21:33:29.548635 4773 scope.go:117] "RemoveContainer" containerID="4658c88275e82df45edaddb9263200ef6b25fe7212b37ca2c50f5af0290232a5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.195894 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:33:45 crc kubenswrapper[4773]: E1012 21:33:45.196589 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="extract-utilities" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.196601 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="extract-utilities" Oct 12 21:33:45 crc kubenswrapper[4773]: E1012 21:33:45.196628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="extract-content" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.196634 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="extract-content" Oct 12 21:33:45 crc kubenswrapper[4773]: E1012 21:33:45.196650 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="registry-server" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.196656 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="registry-server" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.196849 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d939eab5-6e0a-470b-a00c-ff22b7a49b8b" containerName="registry-server" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.198033 4773 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.223845 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.334658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.334777 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmwd\" (UniqueName: \"kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.334833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.437238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.437336 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5qmwd\" (UniqueName: \"kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.437375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.437765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.438162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.458585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmwd\" (UniqueName: \"kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd\") pod \"certified-operators-fksq5\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:45 crc kubenswrapper[4773]: I1012 21:33:45.519475 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:46 crc kubenswrapper[4773]: I1012 21:33:46.083417 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:33:46 crc kubenswrapper[4773]: I1012 21:33:46.722697 4773 generic.go:334] "Generic (PLEG): container finished" podID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerID="0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef" exitCode=0 Oct 12 21:33:46 crc kubenswrapper[4773]: I1012 21:33:46.722899 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerDied","Data":"0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef"} Oct 12 21:33:46 crc kubenswrapper[4773]: I1012 21:33:46.723271 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerStarted","Data":"491e218175ca3679f68572c8363c16082471032145f2fe9a7e07a0521576104c"} Oct 12 21:33:48 crc kubenswrapper[4773]: I1012 21:33:48.746101 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerStarted","Data":"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2"} Oct 12 21:33:50 crc kubenswrapper[4773]: I1012 21:33:50.765664 4773 generic.go:334] "Generic (PLEG): container finished" podID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerID="09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2" exitCode=0 Oct 12 21:33:50 crc kubenswrapper[4773]: I1012 21:33:50.765872 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" 
event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerDied","Data":"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2"} Oct 12 21:33:51 crc kubenswrapper[4773]: I1012 21:33:51.779117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerStarted","Data":"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a"} Oct 12 21:33:51 crc kubenswrapper[4773]: I1012 21:33:51.803142 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fksq5" podStartSLOduration=2.26208572 podStartE2EDuration="6.803123401s" podCreationTimestamp="2025-10-12 21:33:45 +0000 UTC" firstStartedPulling="2025-10-12 21:33:46.725104532 +0000 UTC m=+4174.961403102" lastFinishedPulling="2025-10-12 21:33:51.266142223 +0000 UTC m=+4179.502440783" observedRunningTime="2025-10-12 21:33:51.801462366 +0000 UTC m=+4180.037760926" watchObservedRunningTime="2025-10-12 21:33:51.803123401 +0000 UTC m=+4180.039421961" Oct 12 21:33:55 crc kubenswrapper[4773]: I1012 21:33:55.520131 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:55 crc kubenswrapper[4773]: I1012 21:33:55.520773 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:33:56 crc kubenswrapper[4773]: I1012 21:33:56.567903 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fksq5" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="registry-server" probeResult="failure" output=< Oct 12 21:33:56 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:33:56 crc kubenswrapper[4773]: > Oct 12 21:34:05 crc kubenswrapper[4773]: I1012 21:34:05.587059 4773 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:34:05 crc kubenswrapper[4773]: I1012 21:34:05.656557 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:34:05 crc kubenswrapper[4773]: I1012 21:34:05.864903 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:34:06 crc kubenswrapper[4773]: I1012 21:34:06.927959 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fksq5" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="registry-server" containerID="cri-o://3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a" gracePeriod=2 Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.485067 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.616386 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities\") pod \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.616567 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qmwd\" (UniqueName: \"kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd\") pod \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.616639 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content\") pod \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\" (UID: \"1e0fa7dd-2486-4cbf-98e7-38d0663cf065\") " Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.617339 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities" (OuterVolumeSpecName: "utilities") pod "1e0fa7dd-2486-4cbf-98e7-38d0663cf065" (UID: "1e0fa7dd-2486-4cbf-98e7-38d0663cf065"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.638938 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd" (OuterVolumeSpecName: "kube-api-access-5qmwd") pod "1e0fa7dd-2486-4cbf-98e7-38d0663cf065" (UID: "1e0fa7dd-2486-4cbf-98e7-38d0663cf065"). InnerVolumeSpecName "kube-api-access-5qmwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.695882 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e0fa7dd-2486-4cbf-98e7-38d0663cf065" (UID: "1e0fa7dd-2486-4cbf-98e7-38d0663cf065"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.719013 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qmwd\" (UniqueName: \"kubernetes.io/projected/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-kube-api-access-5qmwd\") on node \"crc\" DevicePath \"\"" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.719069 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.719079 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e0fa7dd-2486-4cbf-98e7-38d0663cf065-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.938820 4773 generic.go:334] "Generic (PLEG): container finished" podID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerID="3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a" exitCode=0 Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.938859 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerDied","Data":"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a"} Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.938884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fksq5" event={"ID":"1e0fa7dd-2486-4cbf-98e7-38d0663cf065","Type":"ContainerDied","Data":"491e218175ca3679f68572c8363c16082471032145f2fe9a7e07a0521576104c"} Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.938900 4773 scope.go:117] "RemoveContainer" containerID="3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 
21:34:07.938904 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fksq5" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.962327 4773 scope.go:117] "RemoveContainer" containerID="09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2" Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.970820 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.978709 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fksq5"] Oct 12 21:34:07 crc kubenswrapper[4773]: I1012 21:34:07.991028 4773 scope.go:117] "RemoveContainer" containerID="0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.066887 4773 scope.go:117] "RemoveContainer" containerID="3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a" Oct 12 21:34:08 crc kubenswrapper[4773]: E1012 21:34:08.067783 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a\": container with ID starting with 3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a not found: ID does not exist" containerID="3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.067821 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a"} err="failed to get container status \"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a\": rpc error: code = NotFound desc = could not find container \"3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a\": container with ID starting with 
3833952b4a3f4030db8ddee3c9dc51b31374dba26f3e52c525ebff170c6cc43a not found: ID does not exist" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.067848 4773 scope.go:117] "RemoveContainer" containerID="09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2" Oct 12 21:34:08 crc kubenswrapper[4773]: E1012 21:34:08.068117 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2\": container with ID starting with 09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2 not found: ID does not exist" containerID="09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.068140 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2"} err="failed to get container status \"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2\": rpc error: code = NotFound desc = could not find container \"09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2\": container with ID starting with 09795e01759834d7f77eef2fd1dcdeec8fdcbc70175f3d6b6efaaf1bf58d6ce2 not found: ID does not exist" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.068158 4773 scope.go:117] "RemoveContainer" containerID="0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef" Oct 12 21:34:08 crc kubenswrapper[4773]: E1012 21:34:08.068387 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef\": container with ID starting with 0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef not found: ID does not exist" containerID="0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef" Oct 12 21:34:08 crc 
kubenswrapper[4773]: I1012 21:34:08.068414 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef"} err="failed to get container status \"0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef\": rpc error: code = NotFound desc = could not find container \"0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef\": container with ID starting with 0336d1a61dfdae5f5619e7c7fa950bcc1c202659ab5d871a81cc1ad18fd8bcef not found: ID does not exist" Oct 12 21:34:08 crc kubenswrapper[4773]: I1012 21:34:08.492508 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" path="/var/lib/kubelet/pods/1e0fa7dd-2486-4cbf-98e7-38d0663cf065/volumes" Oct 12 21:35:25 crc kubenswrapper[4773]: I1012 21:35:25.696188 4773 generic.go:334] "Generic (PLEG): container finished" podID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" containerID="31097c067ff5a7b3a735df9f1ca4d9bbde1cbb93f572dcceebb5f2fd754ce2de" exitCode=0 Oct 12 21:35:25 crc kubenswrapper[4773]: I1012 21:35:25.696329 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345","Type":"ContainerDied","Data":"31097c067ff5a7b3a735df9f1ca4d9bbde1cbb93f572dcceebb5f2fd754ce2de"} Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.221816 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.320925 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321075 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp42w\" (UniqueName: \"kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321250 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321348 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.321386 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret\") pod \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\" (UID: \"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345\") " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.322823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.322944 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data" (OuterVolumeSpecName: "config-data") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.327888 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.338089 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.338198 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w" (OuterVolumeSpecName: "kube-api-access-tp42w") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "kube-api-access-tp42w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.374108 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.395653 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.401445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.402684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" (UID: "4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.424299 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.424556 4773 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.424696 4773 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.424829 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.424918 4773 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.425220 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.426064 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 
crc kubenswrapper[4773]: I1012 21:35:27.426820 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.426954 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp42w\" (UniqueName: \"kubernetes.io/projected/4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345-kube-api-access-tp42w\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.447438 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.528922 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.725307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345","Type":"ContainerDied","Data":"e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e"} Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.725341 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6fa5d69924b08e6320ffb9859d922e94c7070e16d95d1c4597094e932d06a1e" Oct 12 21:35:27 crc kubenswrapper[4773]: I1012 21:35:27.725392 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.412211 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 21:35:39 crc kubenswrapper[4773]: E1012 21:35:39.413663 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="registry-server" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.413688 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="registry-server" Oct 12 21:35:39 crc kubenswrapper[4773]: E1012 21:35:39.413753 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" containerName="tempest-tests-tempest-tests-runner" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.413773 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" containerName="tempest-tests-tempest-tests-runner" Oct 12 21:35:39 crc kubenswrapper[4773]: E1012 21:35:39.413809 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="extract-utilities" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.413829 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="extract-utilities" Oct 12 21:35:39 crc kubenswrapper[4773]: E1012 21:35:39.413857 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="extract-content" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.413870 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="extract-content" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.414287 4773 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1e0fa7dd-2486-4cbf-98e7-38d0663cf065" containerName="registry-server" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.414324 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345" containerName="tempest-tests-tempest-tests-runner" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.415540 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.423392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.423895 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q5v5g" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.485536 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.485636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhcxh\" (UniqueName: \"kubernetes.io/projected/b8b33e95-51a4-42fe-a706-fc146ef7ce27-kube-api-access-dhcxh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.587269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.587394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhcxh\" (UniqueName: \"kubernetes.io/projected/b8b33e95-51a4-42fe-a706-fc146ef7ce27-kube-api-access-dhcxh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.588102 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.721511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhcxh\" (UniqueName: \"kubernetes.io/projected/b8b33e95-51a4-42fe-a706-fc146ef7ce27-kube-api-access-dhcxh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:39 crc kubenswrapper[4773]: I1012 21:35:39.754702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8b33e95-51a4-42fe-a706-fc146ef7ce27\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:40 
crc kubenswrapper[4773]: I1012 21:35:40.053313 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 21:35:40 crc kubenswrapper[4773]: I1012 21:35:40.579937 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 21:35:40 crc kubenswrapper[4773]: W1012 21:35:40.605218 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b33e95_51a4_42fe_a706_fc146ef7ce27.slice/crio-7f09dfbaf917173fa3a547d5890eac3a7c3ec3269da8b5379df0d3065c491425 WatchSource:0}: Error finding container 7f09dfbaf917173fa3a547d5890eac3a7c3ec3269da8b5379df0d3065c491425: Status 404 returned error can't find the container with id 7f09dfbaf917173fa3a547d5890eac3a7c3ec3269da8b5379df0d3065c491425 Oct 12 21:35:40 crc kubenswrapper[4773]: I1012 21:35:40.858102 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8b33e95-51a4-42fe-a706-fc146ef7ce27","Type":"ContainerStarted","Data":"7f09dfbaf917173fa3a547d5890eac3a7c3ec3269da8b5379df0d3065c491425"} Oct 12 21:35:42 crc kubenswrapper[4773]: I1012 21:35:42.887100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8b33e95-51a4-42fe-a706-fc146ef7ce27","Type":"ContainerStarted","Data":"96920dd1ce8241e3ae99eabcca8737ba8230cf048827fdd5232eedee583facff"} Oct 12 21:35:42 crc kubenswrapper[4773]: I1012 21:35:42.907054 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.805834204 podStartE2EDuration="3.907033842s" podCreationTimestamp="2025-10-12 21:35:39 +0000 UTC" firstStartedPulling="2025-10-12 21:35:40.617428397 +0000 UTC 
m=+4288.853726947" lastFinishedPulling="2025-10-12 21:35:41.718627985 +0000 UTC m=+4289.954926585" observedRunningTime="2025-10-12 21:35:42.904644006 +0000 UTC m=+4291.140942576" watchObservedRunningTime="2025-10-12 21:35:42.907033842 +0000 UTC m=+4291.143332422" Oct 12 21:35:58 crc kubenswrapper[4773]: I1012 21:35:58.669131 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:35:58 crc kubenswrapper[4773]: I1012 21:35:58.669665 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.527913 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fbk8/must-gather-cr6hf"] Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.529528 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.532359 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6fbk8"/"default-dockercfg-ds8rf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.533182 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6fbk8"/"kube-root-ca.crt" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.533532 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6fbk8"/"openshift-service-ca.crt" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.545626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fbk8/must-gather-cr6hf"] Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.643940 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.644072 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrs69\" (UniqueName: \"kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.745770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrs69\" (UniqueName: \"kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " 
pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.745972 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:35:59 crc kubenswrapper[4773]: I1012 21:35:59.746331 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:36:00 crc kubenswrapper[4773]: I1012 21:36:00.018901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrs69\" (UniqueName: \"kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69\") pod \"must-gather-cr6hf\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:36:00 crc kubenswrapper[4773]: I1012 21:36:00.147356 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:36:00 crc kubenswrapper[4773]: W1012 21:36:00.642131 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d978211_35bc_410c_a460_964734ca5359.slice/crio-1dc8f002096000a9cfb34e8c296854e3dfbf89735b15c1b9e40ee906fb4a09aa WatchSource:0}: Error finding container 1dc8f002096000a9cfb34e8c296854e3dfbf89735b15c1b9e40ee906fb4a09aa: Status 404 returned error can't find the container with id 1dc8f002096000a9cfb34e8c296854e3dfbf89735b15c1b9e40ee906fb4a09aa Oct 12 21:36:00 crc kubenswrapper[4773]: I1012 21:36:00.646271 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fbk8/must-gather-cr6hf"] Oct 12 21:36:01 crc kubenswrapper[4773]: I1012 21:36:01.051941 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" event={"ID":"8d978211-35bc-410c-a460-964734ca5359","Type":"ContainerStarted","Data":"1dc8f002096000a9cfb34e8c296854e3dfbf89735b15c1b9e40ee906fb4a09aa"} Oct 12 21:36:06 crc kubenswrapper[4773]: I1012 21:36:06.095942 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" event={"ID":"8d978211-35bc-410c-a460-964734ca5359","Type":"ContainerStarted","Data":"a540a1a1060de67ea91fbb3435120fbea9a613094c5044b4deec0ef92db02c44"} Oct 12 21:36:07 crc kubenswrapper[4773]: I1012 21:36:07.106228 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" event={"ID":"8d978211-35bc-410c-a460-964734ca5359","Type":"ContainerStarted","Data":"adcfa4430e5f1fde13c9b12c6b2477723ddefba8a8c86198a579fead890e29b1"} Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.609263 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" podStartSLOduration=9.84714836 
podStartE2EDuration="14.609240544s" podCreationTimestamp="2025-10-12 21:35:59 +0000 UTC" firstStartedPulling="2025-10-12 21:36:00.645578112 +0000 UTC m=+4308.881876672" lastFinishedPulling="2025-10-12 21:36:05.407670276 +0000 UTC m=+4313.643968856" observedRunningTime="2025-10-12 21:36:07.128911219 +0000 UTC m=+4315.365209799" watchObservedRunningTime="2025-10-12 21:36:13.609240544 +0000 UTC m=+4321.845539104" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.616041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-fvphg"] Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.617228 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.795864 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7jz\" (UniqueName: \"kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.796221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.898137 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7jz\" (UniqueName: \"kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: 
I1012 21:36:13.898193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.898345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.914847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7jz\" (UniqueName: \"kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz\") pod \"crc-debug-fvphg\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: I1012 21:36:13.934086 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:36:13 crc kubenswrapper[4773]: W1012 21:36:13.983444 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab77fefe_4fcd_4f10_b46c_f9f4a03687a5.slice/crio-6b3245d6f0d958032793952b3fdc0cfd39360e40df12e7318509b714a00a61bf WatchSource:0}: Error finding container 6b3245d6f0d958032793952b3fdc0cfd39360e40df12e7318509b714a00a61bf: Status 404 returned error can't find the container with id 6b3245d6f0d958032793952b3fdc0cfd39360e40df12e7318509b714a00a61bf Oct 12 21:36:14 crc kubenswrapper[4773]: I1012 21:36:14.191153 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" event={"ID":"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5","Type":"ContainerStarted","Data":"6b3245d6f0d958032793952b3fdc0cfd39360e40df12e7318509b714a00a61bf"} Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.671713 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.686634 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.702577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.761385 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2sn\" (UniqueName: \"kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.764360 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.764710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.867388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2sn\" (UniqueName: \"kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.867515 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.867577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.868186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.868227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:19 crc kubenswrapper[4773]: I1012 21:36:19.903521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2sn\" (UniqueName: \"kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn\") pod \"redhat-operators-xd8h5\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:20 crc kubenswrapper[4773]: I1012 21:36:20.025906 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.003652 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.007680 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.021969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.026106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.026248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.026539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmlw\" (UniqueName: \"kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.128022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.128149 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.128215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmlw\" (UniqueName: \"kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.128973 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.129213 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.165228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmlw\" (UniqueName: 
\"kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw\") pod \"community-operators-z4mz7\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:23 crc kubenswrapper[4773]: I1012 21:36:23.365957 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:26 crc kubenswrapper[4773]: I1012 21:36:26.620926 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:36:26 crc kubenswrapper[4773]: I1012 21:36:26.676752 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:27 crc kubenswrapper[4773]: I1012 21:36:27.360584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerStarted","Data":"ff4d81fecdfd9496bddaf6cfb2c6c779393265b7f779133b648b00c157bcc064"} Oct 12 21:36:27 crc kubenswrapper[4773]: I1012 21:36:27.364197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" event={"ID":"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5","Type":"ContainerStarted","Data":"6c7d2ebb2ed766cfad591a25942f1f2606a61da7cfd06c7cba09af1e105eb07a"} Oct 12 21:36:27 crc kubenswrapper[4773]: I1012 21:36:27.367638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerStarted","Data":"d207c0ef343fe1d195cc0ba17da8eca135d5b4865e29e208e5f5e7a2648efde9"} Oct 12 21:36:27 crc kubenswrapper[4773]: I1012 21:36:27.399921 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" podStartSLOduration=2.108973031 podStartE2EDuration="14.399899757s" 
podCreationTimestamp="2025-10-12 21:36:13 +0000 UTC" firstStartedPulling="2025-10-12 21:36:13.986182578 +0000 UTC m=+4322.222481138" lastFinishedPulling="2025-10-12 21:36:26.277109304 +0000 UTC m=+4334.513407864" observedRunningTime="2025-10-12 21:36:27.396564005 +0000 UTC m=+4335.632862565" watchObservedRunningTime="2025-10-12 21:36:27.399899757 +0000 UTC m=+4335.636198327" Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.376851 4773 generic.go:334] "Generic (PLEG): container finished" podID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerID="8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379" exitCode=0 Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.376926 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerDied","Data":"8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379"} Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.380863 4773 generic.go:334] "Generic (PLEG): container finished" podID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerID="74bf1253aff1ee08b9540c5e5d4eff8bf0488730e94d06bef0f48a21a5e18ea8" exitCode=0 Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.381752 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerDied","Data":"74bf1253aff1ee08b9540c5e5d4eff8bf0488730e94d06bef0f48a21a5e18ea8"} Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.670252 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:36:28 crc kubenswrapper[4773]: I1012 21:36:28.670575 4773 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:36:29 crc kubenswrapper[4773]: I1012 21:36:29.411609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerStarted","Data":"a522132968203e9187f689162ccc4a4ae8a4643955edc873065e9d7226cc8e1e"} Oct 12 21:36:30 crc kubenswrapper[4773]: I1012 21:36:30.422677 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerStarted","Data":"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967"} Oct 12 21:36:33 crc kubenswrapper[4773]: I1012 21:36:33.449962 4773 generic.go:334] "Generic (PLEG): container finished" podID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerID="a522132968203e9187f689162ccc4a4ae8a4643955edc873065e9d7226cc8e1e" exitCode=0 Oct 12 21:36:33 crc kubenswrapper[4773]: I1012 21:36:33.450559 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerDied","Data":"a522132968203e9187f689162ccc4a4ae8a4643955edc873065e9d7226cc8e1e"} Oct 12 21:36:37 crc kubenswrapper[4773]: I1012 21:36:37.495129 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerStarted","Data":"2d21b2c0bd4cab708fbb4889905bc1676e3a5fa5051148cdacd0d305f3cf67b9"} Oct 12 21:36:38 crc kubenswrapper[4773]: I1012 21:36:38.530971 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-z4mz7" podStartSLOduration=8.791891182 podStartE2EDuration="16.53094487s" podCreationTimestamp="2025-10-12 21:36:22 +0000 UTC" firstStartedPulling="2025-10-12 21:36:28.382573783 +0000 UTC m=+4336.618872343" lastFinishedPulling="2025-10-12 21:36:36.121627471 +0000 UTC m=+4344.357926031" observedRunningTime="2025-10-12 21:36:38.523022752 +0000 UTC m=+4346.759321322" watchObservedRunningTime="2025-10-12 21:36:38.53094487 +0000 UTC m=+4346.767243420" Oct 12 21:36:41 crc kubenswrapper[4773]: I1012 21:36:41.544004 4773 generic.go:334] "Generic (PLEG): container finished" podID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerID="8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967" exitCode=0 Oct 12 21:36:41 crc kubenswrapper[4773]: I1012 21:36:41.544703 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerDied","Data":"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967"} Oct 12 21:36:42 crc kubenswrapper[4773]: I1012 21:36:42.553393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerStarted","Data":"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc"} Oct 12 21:36:42 crc kubenswrapper[4773]: I1012 21:36:42.579103 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xd8h5" podStartSLOduration=9.706866418 podStartE2EDuration="23.579087355s" podCreationTimestamp="2025-10-12 21:36:19 +0000 UTC" firstStartedPulling="2025-10-12 21:36:28.379813237 +0000 UTC m=+4336.616111797" lastFinishedPulling="2025-10-12 21:36:42.252034174 +0000 UTC m=+4350.488332734" observedRunningTime="2025-10-12 21:36:42.575664781 +0000 UTC m=+4350.811963341" watchObservedRunningTime="2025-10-12 21:36:42.579087355 
+0000 UTC m=+4350.815385925" Oct 12 21:36:43 crc kubenswrapper[4773]: I1012 21:36:43.367036 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:43 crc kubenswrapper[4773]: I1012 21:36:43.367332 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:44 crc kubenswrapper[4773]: I1012 21:36:44.411259 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-z4mz7" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="registry-server" probeResult="failure" output=< Oct 12 21:36:44 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:36:44 crc kubenswrapper[4773]: > Oct 12 21:36:50 crc kubenswrapper[4773]: I1012 21:36:50.026866 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:50 crc kubenswrapper[4773]: I1012 21:36:50.027478 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:36:51 crc kubenswrapper[4773]: I1012 21:36:51.091921 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xd8h5" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" probeResult="failure" output=< Oct 12 21:36:51 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:36:51 crc kubenswrapper[4773]: > Oct 12 21:36:53 crc kubenswrapper[4773]: I1012 21:36:53.417167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:53 crc kubenswrapper[4773]: I1012 21:36:53.481779 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z4mz7" 
Oct 12 21:36:53 crc kubenswrapper[4773]: I1012 21:36:53.650779 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:54 crc kubenswrapper[4773]: I1012 21:36:54.648502 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z4mz7" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="registry-server" containerID="cri-o://2d21b2c0bd4cab708fbb4889905bc1676e3a5fa5051148cdacd0d305f3cf67b9" gracePeriod=2 Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.659999 4773 generic.go:334] "Generic (PLEG): container finished" podID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerID="2d21b2c0bd4cab708fbb4889905bc1676e3a5fa5051148cdacd0d305f3cf67b9" exitCode=0 Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.660198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerDied","Data":"2d21b2c0bd4cab708fbb4889905bc1676e3a5fa5051148cdacd0d305f3cf67b9"} Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.660309 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mz7" event={"ID":"7b134f77-c348-44db-aee6-bcb45ec1e9e2","Type":"ContainerDied","Data":"d207c0ef343fe1d195cc0ba17da8eca135d5b4865e29e208e5f5e7a2648efde9"} Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.660328 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d207c0ef343fe1d195cc0ba17da8eca135d5b4865e29e208e5f5e7a2648efde9" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.720382 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.796634 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content\") pod \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.796710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities\") pod \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.796843 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmlw\" (UniqueName: \"kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw\") pod \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\" (UID: \"7b134f77-c348-44db-aee6-bcb45ec1e9e2\") " Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.797407 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities" (OuterVolumeSpecName: "utilities") pod "7b134f77-c348-44db-aee6-bcb45ec1e9e2" (UID: "7b134f77-c348-44db-aee6-bcb45ec1e9e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.809718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw" (OuterVolumeSpecName: "kube-api-access-2wmlw") pod "7b134f77-c348-44db-aee6-bcb45ec1e9e2" (UID: "7b134f77-c348-44db-aee6-bcb45ec1e9e2"). InnerVolumeSpecName "kube-api-access-2wmlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.860083 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b134f77-c348-44db-aee6-bcb45ec1e9e2" (UID: "7b134f77-c348-44db-aee6-bcb45ec1e9e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.898504 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.898538 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wmlw\" (UniqueName: \"kubernetes.io/projected/7b134f77-c348-44db-aee6-bcb45ec1e9e2-kube-api-access-2wmlw\") on node \"crc\" DevicePath \"\"" Oct 12 21:36:55 crc kubenswrapper[4773]: I1012 21:36:55.898548 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b134f77-c348-44db-aee6-bcb45ec1e9e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:36:56 crc kubenswrapper[4773]: I1012 21:36:56.669847 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mz7" Oct 12 21:36:56 crc kubenswrapper[4773]: I1012 21:36:56.692336 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:56 crc kubenswrapper[4773]: I1012 21:36:56.703240 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z4mz7"] Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.491093 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" path="/var/lib/kubelet/pods/7b134f77-c348-44db-aee6-bcb45ec1e9e2/volumes" Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.669470 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.669526 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.669577 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.670582 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:36:58 crc kubenswrapper[4773]: I1012 21:36:58.670758 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" gracePeriod=600 Oct 12 21:36:58 crc kubenswrapper[4773]: E1012 21:36:58.910557 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:36:59 crc kubenswrapper[4773]: I1012 21:36:59.695679 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" exitCode=0 Oct 12 21:36:59 crc kubenswrapper[4773]: I1012 21:36:59.695814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b"} Oct 12 21:36:59 crc kubenswrapper[4773]: I1012 21:36:59.695856 4773 scope.go:117] "RemoveContainer" containerID="8b77d4ed6176397bd48568d9ec09896763cf275ce46d9df90f03ec49dc270753" Oct 12 21:36:59 crc kubenswrapper[4773]: I1012 21:36:59.696556 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:36:59 crc kubenswrapper[4773]: E1012 21:36:59.696958 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:37:01 crc kubenswrapper[4773]: I1012 21:37:01.081034 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xd8h5" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" probeResult="failure" output=< Oct 12 21:37:01 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:37:01 crc kubenswrapper[4773]: > Oct 12 21:37:10 crc kubenswrapper[4773]: I1012 21:37:10.074197 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:37:10 crc kubenswrapper[4773]: I1012 21:37:10.131152 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:37:10 crc kubenswrapper[4773]: I1012 21:37:10.317162 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:37:11 crc kubenswrapper[4773]: I1012 21:37:11.791780 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xd8h5" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" containerID="cri-o://66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc" gracePeriod=2 Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.355887 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.400101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content\") pod \"07a36b0c-00aa-4a84-a735-074d2e448f55\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.400391 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities\") pod \"07a36b0c-00aa-4a84-a735-074d2e448f55\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.400454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2sn\" (UniqueName: \"kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn\") pod \"07a36b0c-00aa-4a84-a735-074d2e448f55\" (UID: \"07a36b0c-00aa-4a84-a735-074d2e448f55\") " Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.402100 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities" (OuterVolumeSpecName: "utilities") pod "07a36b0c-00aa-4a84-a735-074d2e448f55" (UID: "07a36b0c-00aa-4a84-a735-074d2e448f55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.406902 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn" (OuterVolumeSpecName: "kube-api-access-kh2sn") pod "07a36b0c-00aa-4a84-a735-074d2e448f55" (UID: "07a36b0c-00aa-4a84-a735-074d2e448f55"). InnerVolumeSpecName "kube-api-access-kh2sn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.483501 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:37:12 crc kubenswrapper[4773]: E1012 21:37:12.484615 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.521315 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07a36b0c-00aa-4a84-a735-074d2e448f55" (UID: "07a36b0c-00aa-4a84-a735-074d2e448f55"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.539335 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh2sn\" (UniqueName: \"kubernetes.io/projected/07a36b0c-00aa-4a84-a735-074d2e448f55-kube-api-access-kh2sn\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.540376 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.540411 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a36b0c-00aa-4a84-a735-074d2e448f55-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.800666 4773 generic.go:334] "Generic (PLEG): container finished" podID="ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" containerID="6c7d2ebb2ed766cfad591a25942f1f2606a61da7cfd06c7cba09af1e105eb07a" exitCode=0 Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.800729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" event={"ID":"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5","Type":"ContainerDied","Data":"6c7d2ebb2ed766cfad591a25942f1f2606a61da7cfd06c7cba09af1e105eb07a"} Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.802966 4773 generic.go:334] "Generic (PLEG): container finished" podID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerID="66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc" exitCode=0 Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.802990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" 
event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerDied","Data":"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc"} Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.803003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xd8h5" event={"ID":"07a36b0c-00aa-4a84-a735-074d2e448f55","Type":"ContainerDied","Data":"ff4d81fecdfd9496bddaf6cfb2c6c779393265b7f779133b648b00c157bcc064"} Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.803034 4773 scope.go:117] "RemoveContainer" containerID="66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.803132 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xd8h5" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.835274 4773 scope.go:117] "RemoveContainer" containerID="8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967" Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.846908 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:37:12 crc kubenswrapper[4773]: I1012 21:37:12.853525 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xd8h5"] Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.231749 4773 scope.go:117] "RemoveContainer" containerID="8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.509781 4773 scope.go:117] "RemoveContainer" containerID="66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc" Oct 12 21:37:13 crc kubenswrapper[4773]: E1012 21:37:13.510532 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc\": container with ID 
starting with 66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc not found: ID does not exist" containerID="66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.510573 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc"} err="failed to get container status \"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc\": rpc error: code = NotFound desc = could not find container \"66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc\": container with ID starting with 66e9aeb46d42114f46536f049de8e83fa40410736dd441f819647c823e5275fc not found: ID does not exist" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.510604 4773 scope.go:117] "RemoveContainer" containerID="8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967" Oct 12 21:37:13 crc kubenswrapper[4773]: E1012 21:37:13.511162 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967\": container with ID starting with 8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967 not found: ID does not exist" containerID="8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.511214 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967"} err="failed to get container status \"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967\": rpc error: code = NotFound desc = could not find container \"8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967\": container with ID starting with 8883c4b000093b8cf816996616a651b39f7443599139955850c41a7c16b59967 not found: 
ID does not exist" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.511248 4773 scope.go:117] "RemoveContainer" containerID="8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379" Oct 12 21:37:13 crc kubenswrapper[4773]: E1012 21:37:13.512194 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379\": container with ID starting with 8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379 not found: ID does not exist" containerID="8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.512231 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379"} err="failed to get container status \"8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379\": rpc error: code = NotFound desc = could not find container \"8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379\": container with ID starting with 8b96847284db5c2fd669e7b8896e170bad4d47efa35f60ed566bbbd176680379 not found: ID does not exist" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.901845 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.933494 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-fvphg"] Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.940395 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-fvphg"] Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.966515 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7jz\" (UniqueName: \"kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz\") pod \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.967157 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host\") pod \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\" (UID: \"ab77fefe-4fcd-4f10-b46c-f9f4a03687a5\") " Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.967325 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host" (OuterVolumeSpecName: "host") pod "ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" (UID: "ab77fefe-4fcd-4f10-b46c-f9f4a03687a5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.967570 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:13 crc kubenswrapper[4773]: I1012 21:37:13.973297 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz" (OuterVolumeSpecName: "kube-api-access-dl7jz") pod "ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" (UID: "ab77fefe-4fcd-4f10-b46c-f9f4a03687a5"). InnerVolumeSpecName "kube-api-access-dl7jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:37:14 crc kubenswrapper[4773]: I1012 21:37:14.069638 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7jz\" (UniqueName: \"kubernetes.io/projected/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5-kube-api-access-dl7jz\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:14 crc kubenswrapper[4773]: I1012 21:37:14.490654 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" path="/var/lib/kubelet/pods/07a36b0c-00aa-4a84-a735-074d2e448f55/volumes" Oct 12 21:37:14 crc kubenswrapper[4773]: I1012 21:37:14.491520 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" path="/var/lib/kubelet/pods/ab77fefe-4fcd-4f10-b46c-f9f4a03687a5/volumes" Oct 12 21:37:14 crc kubenswrapper[4773]: I1012 21:37:14.825626 4773 scope.go:117] "RemoveContainer" containerID="6c7d2ebb2ed766cfad591a25942f1f2606a61da7cfd06c7cba09af1e105eb07a" Oct 12 21:37:14 crc kubenswrapper[4773]: I1012 21:37:14.825744 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-fvphg" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.245436 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-965c9"] Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246144 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246160 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246173 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" containerName="container-00" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246178 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" containerName="container-00" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246191 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="extract-utilities" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246198 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="extract-utilities" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246216 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="extract-content" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246224 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="extract-content" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246245 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" 
containerName="extract-content" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246250 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="extract-content" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246263 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="extract-utilities" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246269 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="extract-utilities" Oct 12 21:37:15 crc kubenswrapper[4773]: E1012 21:37:15.246278 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246284 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246448 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b134f77-c348-44db-aee6-bcb45ec1e9e2" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246464 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a36b0c-00aa-4a84-a735-074d2e448f55" containerName="registry-server" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.246479 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab77fefe-4fcd-4f10-b46c-f9f4a03687a5" containerName="container-00" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.247064 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.392787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b29s\" (UniqueName: \"kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.393129 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.495419 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.495549 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b29s\" (UniqueName: \"kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.495810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc 
kubenswrapper[4773]: I1012 21:37:15.516151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b29s\" (UniqueName: \"kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s\") pod \"crc-debug-965c9\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.567464 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.835855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-965c9" event={"ID":"02e41d15-b909-4b31-ba76-eba652857701","Type":"ContainerStarted","Data":"231505d4a372454417470cb3b744a956911bec6d7f48fa7388edf99a0daec05e"} Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.836405 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-965c9" event={"ID":"02e41d15-b909-4b31-ba76-eba652857701","Type":"ContainerStarted","Data":"74a85e5f324222268ca0600d9b29c26eade6d17746402e7b46317886561462cd"} Oct 12 21:37:15 crc kubenswrapper[4773]: I1012 21:37:15.854047 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fbk8/crc-debug-965c9" podStartSLOduration=0.854028524 podStartE2EDuration="854.028524ms" podCreationTimestamp="2025-10-12 21:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:37:15.848238315 +0000 UTC m=+4384.084536875" watchObservedRunningTime="2025-10-12 21:37:15.854028524 +0000 UTC m=+4384.090327084" Oct 12 21:37:16 crc kubenswrapper[4773]: I1012 21:37:16.845521 4773 generic.go:334] "Generic (PLEG): container finished" podID="02e41d15-b909-4b31-ba76-eba652857701" 
containerID="231505d4a372454417470cb3b744a956911bec6d7f48fa7388edf99a0daec05e" exitCode=0 Oct 12 21:37:16 crc kubenswrapper[4773]: I1012 21:37:16.845604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-965c9" event={"ID":"02e41d15-b909-4b31-ba76-eba652857701","Type":"ContainerDied","Data":"231505d4a372454417470cb3b744a956911bec6d7f48fa7388edf99a0daec05e"} Oct 12 21:37:17 crc kubenswrapper[4773]: I1012 21:37:17.979883 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.016477 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-965c9"] Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.024649 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-965c9"] Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.045802 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b29s\" (UniqueName: \"kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s\") pod \"02e41d15-b909-4b31-ba76-eba652857701\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.045874 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host\") pod \"02e41d15-b909-4b31-ba76-eba652857701\" (UID: \"02e41d15-b909-4b31-ba76-eba652857701\") " Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.046055 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host" (OuterVolumeSpecName: "host") pod "02e41d15-b909-4b31-ba76-eba652857701" (UID: "02e41d15-b909-4b31-ba76-eba652857701"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.046443 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02e41d15-b909-4b31-ba76-eba652857701-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.051879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s" (OuterVolumeSpecName: "kube-api-access-4b29s") pod "02e41d15-b909-4b31-ba76-eba652857701" (UID: "02e41d15-b909-4b31-ba76-eba652857701"). InnerVolumeSpecName "kube-api-access-4b29s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.147986 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b29s\" (UniqueName: \"kubernetes.io/projected/02e41d15-b909-4b31-ba76-eba652857701-kube-api-access-4b29s\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.491246 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e41d15-b909-4b31-ba76-eba652857701" path="/var/lib/kubelet/pods/02e41d15-b909-4b31-ba76-eba652857701/volumes" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.865903 4773 scope.go:117] "RemoveContainer" containerID="231505d4a372454417470cb3b744a956911bec6d7f48fa7388edf99a0daec05e" Oct 12 21:37:18 crc kubenswrapper[4773]: I1012 21:37:18.866362 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-965c9" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.197358 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-8mtnz"] Oct 12 21:37:19 crc kubenswrapper[4773]: E1012 21:37:19.197848 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e41d15-b909-4b31-ba76-eba652857701" containerName="container-00" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.197863 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e41d15-b909-4b31-ba76-eba652857701" containerName="container-00" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.198051 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e41d15-b909-4b31-ba76-eba652857701" containerName="container-00" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.198670 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.268902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhpg\" (UniqueName: \"kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.269140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.371153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhpg\" (UniqueName: 
\"kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.371337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.371496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.407496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhpg\" (UniqueName: \"kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg\") pod \"crc-debug-8mtnz\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.513269 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.879939 4773 generic.go:334] "Generic (PLEG): container finished" podID="ea407521-51bc-4cab-90be-b39729d077c5" containerID="2c6648b281f26bdf19c9b9f96b2547bf7c41465d721fdcdd0cd2ed2de343146a" exitCode=0 Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.880115 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" event={"ID":"ea407521-51bc-4cab-90be-b39729d077c5","Type":"ContainerDied","Data":"2c6648b281f26bdf19c9b9f96b2547bf7c41465d721fdcdd0cd2ed2de343146a"} Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.880291 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" event={"ID":"ea407521-51bc-4cab-90be-b39729d077c5","Type":"ContainerStarted","Data":"da9b0dc7dd19ed586c69066d763c421bcdf18ae2196edf94f9769204695a6ad2"} Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.914677 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-8mtnz"] Oct 12 21:37:19 crc kubenswrapper[4773]: I1012 21:37:19.932606 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6fbk8/crc-debug-8mtnz"] Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.181978 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.318185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nhpg\" (UniqueName: \"kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg\") pod \"ea407521-51bc-4cab-90be-b39729d077c5\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.318663 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host\") pod \"ea407521-51bc-4cab-90be-b39729d077c5\" (UID: \"ea407521-51bc-4cab-90be-b39729d077c5\") " Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.318877 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host" (OuterVolumeSpecName: "host") pod "ea407521-51bc-4cab-90be-b39729d077c5" (UID: "ea407521-51bc-4cab-90be-b39729d077c5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.319252 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea407521-51bc-4cab-90be-b39729d077c5-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.323856 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg" (OuterVolumeSpecName: "kube-api-access-6nhpg") pod "ea407521-51bc-4cab-90be-b39729d077c5" (UID: "ea407521-51bc-4cab-90be-b39729d077c5"). InnerVolumeSpecName "kube-api-access-6nhpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.420981 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nhpg\" (UniqueName: \"kubernetes.io/projected/ea407521-51bc-4cab-90be-b39729d077c5-kube-api-access-6nhpg\") on node \"crc\" DevicePath \"\"" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.897787 4773 scope.go:117] "RemoveContainer" containerID="2c6648b281f26bdf19c9b9f96b2547bf7c41465d721fdcdd0cd2ed2de343146a" Oct 12 21:37:21 crc kubenswrapper[4773]: I1012 21:37:21.897862 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/crc-debug-8mtnz" Oct 12 21:37:22 crc kubenswrapper[4773]: I1012 21:37:22.494246 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea407521-51bc-4cab-90be-b39729d077c5" path="/var/lib/kubelet/pods/ea407521-51bc-4cab-90be-b39729d077c5/volumes" Oct 12 21:37:25 crc kubenswrapper[4773]: I1012 21:37:25.481435 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:37:25 crc kubenswrapper[4773]: E1012 21:37:25.482030 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:37:39 crc kubenswrapper[4773]: I1012 21:37:39.481208 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:37:39 crc kubenswrapper[4773]: E1012 21:37:39.482331 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:37:54 crc kubenswrapper[4773]: I1012 21:37:54.480778 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:37:54 crc kubenswrapper[4773]: E1012 21:37:54.481607 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:37:54 crc kubenswrapper[4773]: I1012 21:37:54.860176 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b76446c56-rrfn7_44baa955-b25d-4648-aef5-423ad5992301/barbican-api-log/0.log" Oct 12 21:37:54 crc kubenswrapper[4773]: I1012 21:37:54.915300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b76446c56-rrfn7_44baa955-b25d-4648-aef5-423ad5992301/barbican-api/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.029287 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b8cc45894-xq876_1fa331c5-06f0-4fac-b997-c68390b26f62/barbican-keystone-listener/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.123053 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b8cc45894-xq876_1fa331c5-06f0-4fac-b997-c68390b26f62/barbican-keystone-listener-log/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 
21:37:55.244449 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bccc98b47-7pq24_bccbf811-29d3-4a21-856b-4ae1cfb29c74/barbican-worker/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.327768 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bccc98b47-7pq24_bccbf811-29d3-4a21-856b-4ae1cfb29c74/barbican-worker-log/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.505224 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2_25b3f977-6673-4aa8-aadc-89d98ceb7638/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.642241 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/ceilometer-central-agent/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.703780 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/ceilometer-notification-agent/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.718360 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/proxy-httpd/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.840162 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/sg-core/0.log" Oct 12 21:37:55 crc kubenswrapper[4773]: I1012 21:37:55.982516 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd_f7d6457c-5706-4a38-b0ef-24cc906b7cab/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:56 crc kubenswrapper[4773]: I1012 21:37:56.145655 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q_9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:56 crc kubenswrapper[4773]: I1012 21:37:56.345611 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_47b1d281-5528-455b-8b30-b636772d29ce/cinder-api/0.log" Oct 12 21:37:56 crc kubenswrapper[4773]: I1012 21:37:56.367419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_47b1d281-5528-455b-8b30-b636772d29ce/cinder-api-log/0.log" Oct 12 21:37:56 crc kubenswrapper[4773]: I1012 21:37:56.640153 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_716b9576-d48b-4720-9fb4-73f6744adee5/probe/0.log" Oct 12 21:37:56 crc kubenswrapper[4773]: I1012 21:37:56.649554 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_716b9576-d48b-4720-9fb4-73f6744adee5/cinder-backup/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.155186 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_32308982-4e4e-4ca5-98d8-b173e22fa341/probe/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.156371 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_32308982-4e4e-4ca5-98d8-b173e22fa341/cinder-scheduler/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.425578 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c3963233-e9ff-4f92-a94a-5b99835ab607/cinder-volume/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.478471 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c3963233-e9ff-4f92-a94a-5b99835ab607/probe/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.715467 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xzgst_dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:57 crc kubenswrapper[4773]: I1012 21:37:57.846740 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-f77w9_17284681-e0c1-42f8-8ee2-2b3f8e73e6d1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.050188 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/init/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.154281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/init/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.309763 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/dnsmasq-dns/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.331214 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6225801c-d77f-493c-834a-1393a8a1d239/glance-httpd/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.414864 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6225801c-d77f-493c-834a-1393a8a1d239/glance-log/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.911292 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e607bc3-d77b-4dfb-a697-911f6dea3244/glance-httpd/0.log" Oct 12 21:37:58 crc kubenswrapper[4773]: I1012 21:37:58.945936 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_1e607bc3-d77b-4dfb-a697-911f6dea3244/glance-log/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.180489 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5486cbb4-g66c2_7f878004-f437-4db3-a695-09d92a0bc6e4/horizon/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.281441 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk_23eb0d3e-06b9-4b1e-b493-27d00d4f34f4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.307415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5486cbb4-g66c2_7f878004-f437-4db3-a695-09d92a0bc6e4/horizon-log/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.459343 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fk9wz_4d530e38-f79d-4b93-9d2a-ad94eddb69b1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.652098 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d9b9d6b96-hvhdj_addfad9c-82e3-4f44-883e-c88e44a3641d/keystone-api/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.704079 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29338381-pxljd_d2812224-3ef8-431f-896d-01d9d78c3650/keystone-cron/0.log" Oct 12 21:37:59 crc kubenswrapper[4773]: I1012 21:37:59.825422 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5c04ed55-a588-4a57-9f14-90fca8e2dab0/kube-state-metrics/0.log" Oct 12 21:38:00 crc kubenswrapper[4773]: I1012 21:38:00.801272 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v_e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:00 crc kubenswrapper[4773]: I1012 21:38:00.841506 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_06283d24-b053-4893-9dab-4bfe5daf18b1/manila-api-log/0.log" Oct 12 21:38:00 crc kubenswrapper[4773]: I1012 21:38:00.870940 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_06283d24-b053-4893-9dab-4bfe5daf18b1/manila-api/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.061183 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dfbdaab1-e327-4291-a585-829aa6b81f00/probe/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.066999 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dfbdaab1-e327-4291-a585-829aa6b81f00/manila-scheduler/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.246616 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5db237cf-3d2a-48e1-bf07-a92ae2d96139/probe/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.282469 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5db237cf-3d2a-48e1-bf07-a92ae2d96139/manila-share/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.618036 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7857d9f9fc-69hqj_86be19ef-4d97-4e17-bfbc-3c9c8153cd76/neutron-api/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.625626 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7857d9f9fc-69hqj_86be19ef-4d97-4e17-bfbc-3c9c8153cd76/neutron-httpd/0.log" Oct 12 21:38:01 crc kubenswrapper[4773]: I1012 21:38:01.868561 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb_0f130afc-51e5-494f-b915-7ec573c760b1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:02 crc kubenswrapper[4773]: I1012 21:38:02.425431 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8163c65f-b48b-4fd5-b7c1-12d94abfa723/nova-api-log/0.log" Oct 12 21:38:02 crc kubenswrapper[4773]: I1012 21:38:02.649347 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6698b6b5-2a0c-45d1-a3dc-ea58147105dc/nova-cell0-conductor-conductor/0.log" Oct 12 21:38:02 crc kubenswrapper[4773]: I1012 21:38:02.694279 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8163c65f-b48b-4fd5-b7c1-12d94abfa723/nova-api-api/0.log" Oct 12 21:38:02 crc kubenswrapper[4773]: I1012 21:38:02.976014 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5838c3b9-38bc-4a97-bcbc-3a734b6b230f/nova-cell1-conductor-conductor/0.log" Oct 12 21:38:03 crc kubenswrapper[4773]: I1012 21:38:03.051685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4f9f8baf-3f94-4e6c-b5ec-f9763330a042/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 21:38:03 crc kubenswrapper[4773]: I1012 21:38:03.326100 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq_f216a3f9-57a7-4084-b8c1-ed07cd69d4ac/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:03 crc kubenswrapper[4773]: I1012 21:38:03.523582 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74ec9771-6918-4102-abd1-7b9130f91a4d/nova-metadata-log/0.log" Oct 12 21:38:04 crc kubenswrapper[4773]: I1012 21:38:04.000482 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_bd4ee762-fd56-496f-860b-89201215948c/nova-scheduler-scheduler/0.log" Oct 12 21:38:04 crc kubenswrapper[4773]: I1012 21:38:04.225136 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/mysql-bootstrap/0.log" Oct 12 21:38:04 crc kubenswrapper[4773]: I1012 21:38:04.383642 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/mysql-bootstrap/0.log" Oct 12 21:38:04 crc kubenswrapper[4773]: I1012 21:38:04.476475 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/galera/0.log" Oct 12 21:38:04 crc kubenswrapper[4773]: I1012 21:38:04.880086 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/mysql-bootstrap/0.log" Oct 12 21:38:05 crc kubenswrapper[4773]: I1012 21:38:05.010821 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/mysql-bootstrap/0.log" Oct 12 21:38:05 crc kubenswrapper[4773]: I1012 21:38:05.154300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/galera/0.log" Oct 12 21:38:05 crc kubenswrapper[4773]: I1012 21:38:05.301580 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74ec9771-6918-4102-abd1-7b9130f91a4d/nova-metadata-metadata/0.log" Oct 12 21:38:05 crc kubenswrapper[4773]: I1012 21:38:05.387898 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3028de9d-aaa8-4c46-9cbb-a4ab147bf458/openstackclient/0.log" Oct 12 21:38:05 crc kubenswrapper[4773]: I1012 21:38:05.574300 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-7kldg_b57600c2-89e9-4db4-a846-48235987e13c/openstack-network-exporter/0.log" Oct 12 21:38:06 crc kubenswrapper[4773]: I1012 21:38:06.095690 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server-init/0.log" Oct 12 21:38:06 crc kubenswrapper[4773]: I1012 21:38:06.434626 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server-init/0.log" Oct 12 21:38:06 crc kubenswrapper[4773]: I1012 21:38:06.480552 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:38:06 crc kubenswrapper[4773]: E1012 21:38:06.480893 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:38:06 crc kubenswrapper[4773]: I1012 21:38:06.497424 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server/0.log" Oct 12 21:38:06 crc kubenswrapper[4773]: I1012 21:38:06.538832 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovs-vswitchd/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.227309 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sf74r_1a08bcbe-fa8c-43b2-a4fb-ae2212de940d/ovn-controller/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.436179 4773 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dlkwd_62f86eec-8f45-4449-a363-cb195f58abbd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.467425 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4a50ca31-4c77-488f-aed7-aa99e82677f0/openstack-network-exporter/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.598550 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4a50ca31-4c77-488f-aed7-aa99e82677f0/ovn-northd/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.697388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc85890c-cde2-470e-87de-4d69f1682bd0/openstack-network-exporter/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.766251 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc85890c-cde2-470e-87de-4d69f1682bd0/ovsdbserver-nb/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.944214 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_21606d72-32b0-4552-ac26-df0425f03cdf/openstack-network-exporter/0.log" Oct 12 21:38:07 crc kubenswrapper[4773]: I1012 21:38:07.951599 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_21606d72-32b0-4552-ac26-df0425f03cdf/ovsdbserver-sb/0.log" Oct 12 21:38:08 crc kubenswrapper[4773]: I1012 21:38:08.274476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d7c87b9bb-vwlxb_c9a14159-b8fe-40c9-b7ac-6c410c02a0ab/placement-api/0.log" Oct 12 21:38:08 crc kubenswrapper[4773]: I1012 21:38:08.374130 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d7c87b9bb-vwlxb_c9a14159-b8fe-40c9-b7ac-6c410c02a0ab/placement-log/0.log" Oct 12 21:38:08 crc kubenswrapper[4773]: I1012 21:38:08.841958 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/setup-container/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.103063 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/setup-container/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.186624 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/setup-container/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.225155 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/rabbitmq/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.461120 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/rabbitmq/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.527871 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ch565_073e807a-4708-4b50-abf6-f66668e13e8e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:09 crc kubenswrapper[4773]: I1012 21:38:09.578532 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/setup-container/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.176820 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p_d3045d7b-b25d-4036-bea1-0b5f184476eb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.297918 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wptgl_83cb532a-174c-41c0-a271-95a66d439f0c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.546694 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sdsnz_8c71487d-25fd-480c-90ca-4ca43f86a247/ssh-known-hosts-edpm-deployment/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.574155 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345/tempest-tests-tempest-tests-runner/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.819794 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8b33e95-51a4-42fe-a706-fc146ef7ce27/test-operator-logs-container/0.log" Oct 12 21:38:10 crc kubenswrapper[4773]: I1012 21:38:10.893813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j_9ef8a23e-6501-4e90-a51c-0d57cee847af/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:38:18 crc kubenswrapper[4773]: I1012 21:38:18.482408 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:38:18 crc kubenswrapper[4773]: E1012 21:38:18.483043 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:38:22 crc kubenswrapper[4773]: I1012 21:38:22.100276 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_8aeb19a9-db56-488b-9410-004f24e8d11a/memcached/0.log" Oct 12 21:38:32 crc kubenswrapper[4773]: I1012 21:38:32.491655 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:38:32 crc kubenswrapper[4773]: E1012 21:38:32.492429 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:38:38 crc kubenswrapper[4773]: I1012 21:38:38.898843 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-rqvz2_ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a/kube-rbac-proxy/0.log" Oct 12 21:38:38 crc kubenswrapper[4773]: I1012 21:38:38.964578 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-rqvz2_ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a/manager/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.121615 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.264499 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.302220 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.335770 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.528445 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.529489 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/extract/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.529674 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.731898 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-4wwwj_e3a81848-dc85-44b3-addf-35cb34c1e85a/kube-rbac-proxy/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.749207 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-4wwwj_e3a81848-dc85-44b3-addf-35cb34c1e85a/manager/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.814066 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-sfgw7_69dd4207-8b02-4a43-bc3a-9c939881422f/kube-rbac-proxy/0.log" Oct 12 21:38:39 crc 
kubenswrapper[4773]: I1012 21:38:39.935916 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-sfgw7_69dd4207-8b02-4a43-bc3a-9c939881422f/manager/0.log" Oct 12 21:38:39 crc kubenswrapper[4773]: I1012 21:38:39.998034 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-llrqr_3bff7ce4-adb2-494b-8644-f8e7568efa62/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.098119 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-llrqr_3bff7ce4-adb2-494b-8644-f8e7568efa62/manager/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.162194 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xnqzt_e963f42c-7955-4378-927e-1ab264a6116e/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.230962 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xnqzt_e963f42c-7955-4378-927e-1ab264a6116e/manager/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.348606 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-shp22_9392e042-5a5f-47d2-9232-3fa47cce88f3/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.397457 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-shp22_9392e042-5a5f-47d2-9232-3fa47cce88f3/manager/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.554383 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-8j4jx_6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.719321 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-8j4jx_6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17/manager/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.738063 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-fqqwc_83700a3c-4ccd-4ac6-8c0a-c530623ffdfe/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.823575 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-fqqwc_83700a3c-4ccd-4ac6-8c0a-c530623ffdfe/manager/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.877369 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-sc6z2_5321f2fd-a14c-4a48-be68-bdbefe80aa8d/kube-rbac-proxy/0.log" Oct 12 21:38:40 crc kubenswrapper[4773]: I1012 21:38:40.995332 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-sc6z2_5321f2fd-a14c-4a48-be68-bdbefe80aa8d/manager/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.179955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-jh557_2e680c12-2026-4296-8ffa-d0185c12d2c1/kube-rbac-proxy/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.208487 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-jh557_2e680c12-2026-4296-8ffa-d0185c12d2c1/manager/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.336432 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-dtppq_843b5d05-f35d-4632-8781-4c60ed803cb6/kube-rbac-proxy/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.443919 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-b6kht_7dc3b970-233d-4af3-a341-8297af5433bc/kube-rbac-proxy/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.516707 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-dtppq_843b5d05-f35d-4632-8781-4c60ed803cb6/manager/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.595293 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-b6kht_7dc3b970-233d-4af3-a341-8297af5433bc/manager/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.725299 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-pbmbc_f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b/kube-rbac-proxy/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.836363 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-pbmbc_f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b/manager/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.886758 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-nvvg8_095b027c-fb46-4d19-bbcf-84871f8c90f7/kube-rbac-proxy/0.log" Oct 12 21:38:41 crc kubenswrapper[4773]: I1012 21:38:41.947749 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-nvvg8_095b027c-fb46-4d19-bbcf-84871f8c90f7/manager/0.log" Oct 12 21:38:42 crc 
kubenswrapper[4773]: I1012 21:38:42.090876 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf_b9e5880b-293d-4311-8928-f93649649c93/kube-rbac-proxy/0.log" Oct 12 21:38:42 crc kubenswrapper[4773]: I1012 21:38:42.127089 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf_b9e5880b-293d-4311-8928-f93649649c93/manager/0.log" Oct 12 21:38:42 crc kubenswrapper[4773]: I1012 21:38:42.833551 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-xgrzm_094825fc-aaad-4717-9d34-426f1f3fa63f/kube-rbac-proxy/0.log" Oct 12 21:38:42 crc kubenswrapper[4773]: I1012 21:38:42.895980 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-qgbcw_96a44ad1-ead6-4c4d-be23-622d643a0bf0/kube-rbac-proxy/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.112913 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-qgbcw_96a44ad1-ead6-4c4d-be23-622d643a0bf0/operator/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.240886 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5p2sl_237d38ea-2958-4510-a3e3-20b37bf0814d/registry-server/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.353037 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-rczp5_34c81f6e-1829-4f0f-a0aa-951b4d4f41c4/kube-rbac-proxy/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.498419 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-rczp5_34c81f6e-1829-4f0f-a0aa-951b4d4f41c4/manager/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.546871 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-thj2w_2b083bd3-8fe4-44c8-8d3e-f736260b8210/kube-rbac-proxy/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.750056 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-thj2w_2b083bd3-8fe4-44c8-8d3e-f736260b8210/manager/0.log" Oct 12 21:38:43 crc kubenswrapper[4773]: I1012 21:38:43.925055 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm_f7349c73-c56a-4e87-8618-dea521d99b95/operator/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.067354 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-xgrzm_094825fc-aaad-4717-9d34-426f1f3fa63f/manager/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.174193 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vx9cr_b2ec8f8f-d841-4683-86ed-54ec360d9ec1/kube-rbac-proxy/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.285701 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vx9cr_b2ec8f8f-d841-4683-86ed-54ec360d9ec1/manager/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.334028 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-bcdc4_293153be-33db-41ba-a589-55a17026c756/kube-rbac-proxy/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.481008 4773 
scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:38:44 crc kubenswrapper[4773]: E1012 21:38:44.481259 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.484728 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-bcdc4_293153be-33db-41ba-a589-55a17026c756/manager/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.554660 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-hz6mw_e38708a6-e3b7-407d-8fe5-f27cd9a69f76/manager/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.599499 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-hz6mw_e38708a6-e3b7-407d-8fe5-f27cd9a69f76/kube-rbac-proxy/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.782325 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-4dsfj_38cef8bd-b25e-47aa-8f3f-9af1289f72f8/kube-rbac-proxy/0.log" Oct 12 21:38:44 crc kubenswrapper[4773]: I1012 21:38:44.791724 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-4dsfj_38cef8bd-b25e-47aa-8f3f-9af1289f72f8/manager/0.log" Oct 12 21:38:56 crc kubenswrapper[4773]: I1012 21:38:56.480901 4773 scope.go:117] "RemoveContainer" 
containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:38:56 crc kubenswrapper[4773]: E1012 21:38:56.481611 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:39:00 crc kubenswrapper[4773]: I1012 21:39:00.576475 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-89dsq_fa851b59-ffb3-46c4-a61e-31f85d43eb7a/control-plane-machine-set-operator/0.log" Oct 12 21:39:00 crc kubenswrapper[4773]: I1012 21:39:00.722934 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmsrw_bf381381-f5d3-4217-8a9c-cf527e2c6c65/kube-rbac-proxy/0.log" Oct 12 21:39:00 crc kubenswrapper[4773]: I1012 21:39:00.743654 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmsrw_bf381381-f5d3-4217-8a9c-cf527e2c6c65/machine-api-operator/0.log" Oct 12 21:39:07 crc kubenswrapper[4773]: I1012 21:39:07.480896 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:39:07 crc kubenswrapper[4773]: E1012 21:39:07.481636 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:39:14 crc kubenswrapper[4773]: I1012 21:39:14.052319 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qbzg6_6003117d-518b-4b81-98ba-01ffbdea09c7/cert-manager-controller/0.log" Oct 12 21:39:14 crc kubenswrapper[4773]: I1012 21:39:14.192036 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lshqf_5c1610be-cf14-4659-8bf8-46cbcb55aa47/cert-manager-cainjector/0.log" Oct 12 21:39:14 crc kubenswrapper[4773]: I1012 21:39:14.216065 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gh5rv_882eaacb-03d9-4250-ab13-b702c4f4b91c/cert-manager-webhook/0.log" Oct 12 21:39:19 crc kubenswrapper[4773]: I1012 21:39:19.481654 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:39:19 crc kubenswrapper[4773]: E1012 21:39:19.482413 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:39:26 crc kubenswrapper[4773]: I1012 21:39:26.759789 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-77gl5_476898a0-6b77-4b46-8a73-1a0fa1e336c8/nmstate-console-plugin/0.log" Oct 12 21:39:26 crc kubenswrapper[4773]: I1012 21:39:26.877263 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gpbvq_04007580-35e5-42d5-84ec-1e44c4d6d914/nmstate-handler/0.log" Oct 12 21:39:26 crc kubenswrapper[4773]: I1012 21:39:26.909873 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8mnws_751cb256-7079-497a-a027-a9c295bc9832/kube-rbac-proxy/0.log" Oct 12 21:39:26 crc kubenswrapper[4773]: I1012 21:39:26.983680 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8mnws_751cb256-7079-497a-a027-a9c295bc9832/nmstate-metrics/0.log" Oct 12 21:39:27 crc kubenswrapper[4773]: I1012 21:39:27.074989 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5qz2w_c290e672-35df-4626-8034-095052214269/nmstate-operator/0.log" Oct 12 21:39:27 crc kubenswrapper[4773]: I1012 21:39:27.189678 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-wzw64_fdf1901d-c523-4385-9415-fae96f1ea74c/nmstate-webhook/0.log" Oct 12 21:39:34 crc kubenswrapper[4773]: I1012 21:39:34.481696 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:39:34 crc kubenswrapper[4773]: E1012 21:39:34.482419 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.131762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mcbq6_9c120d38-3572-486b-9b37-946d2358e130/kube-rbac-proxy/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.346067 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-mcbq6_9c120d38-3572-486b-9b37-946d2358e130/controller/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.388744 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.554069 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.564420 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.637278 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.646462 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.893568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.920405 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.940595 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:39:42 crc kubenswrapper[4773]: I1012 21:39:42.962081 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.142216 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.164113 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.189172 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.297459 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/controller/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.368769 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/frr-metrics/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.497375 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/kube-rbac-proxy/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.684778 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/kube-rbac-proxy-frr/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.688638 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/reloader/0.log" Oct 12 21:39:43 crc kubenswrapper[4773]: I1012 21:39:43.901644 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sq7f5_aba7a037-467a-40bd-b2e5-4c446be76185/frr-k8s-webhook-server/0.log" Oct 12 21:39:44 crc kubenswrapper[4773]: I1012 21:39:44.224239 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d86f779f8-r94wm_774b15ab-55ba-42a6-8a77-13690e6aa683/manager/0.log" Oct 12 21:39:44 crc kubenswrapper[4773]: I1012 21:39:44.386657 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d8b4c7c74-pbqqx_180f9b25-f871-4854-b535-73fd6bd1d7f0/webhook-server/0.log" Oct 12 21:39:44 crc kubenswrapper[4773]: I1012 21:39:44.563168 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-df4jg_5c70a42a-d5f5-4b1d-b23b-cd672597789c/kube-rbac-proxy/0.log" Oct 12 21:39:44 crc kubenswrapper[4773]: I1012 21:39:44.699695 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/frr/0.log" Oct 12 21:39:45 crc kubenswrapper[4773]: I1012 21:39:45.137950 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-df4jg_5c70a42a-d5f5-4b1d-b23b-cd672597789c/speaker/0.log" Oct 12 21:39:47 crc kubenswrapper[4773]: I1012 21:39:47.480866 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:39:47 crc kubenswrapper[4773]: E1012 21:39:47.481680 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.110912 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.200986 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.236300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.255745 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.447130 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.464706 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.528605 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/extract/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.651083 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.874965 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:40:00 crc kubenswrapper[4773]: I1012 21:40:00.923947 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.178616 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.201202 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.216082 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.480622 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:40:01 crc kubenswrapper[4773]: E1012 21:40:01.480966 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:40:01 
crc kubenswrapper[4773]: I1012 21:40:01.517235 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.715435 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.725819 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.773687 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/registry-server/0.log" Oct 12 21:40:01 crc kubenswrapper[4773]: I1012 21:40:01.784701 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.065523 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.069205 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.379427 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.577658 4773 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/registry-server/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.752538 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.754248 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:40:02 crc kubenswrapper[4773]: I1012 21:40:02.799064 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.015762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/extract/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.019482 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.054655 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.251510 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2v2pc_3f993aa7-e2c9-41bb-96ba-0b4e0682c92a/marketplace-operator/0.log" Oct 12 21:40:03 crc 
kubenswrapper[4773]: I1012 21:40:03.354309 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.514380 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.555076 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.582389 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.735226 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.804012 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:40:03 crc kubenswrapper[4773]: I1012 21:40:03.984816 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/registry-server/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.009495 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.176646 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.176857 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.219035 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.487013 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.513396 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:40:04 crc kubenswrapper[4773]: I1012 21:40:04.779869 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/registry-server/0.log" Oct 12 21:40:13 crc kubenswrapper[4773]: I1012 21:40:13.480942 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:40:13 crc kubenswrapper[4773]: E1012 21:40:13.481760 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:40:24 crc 
kubenswrapper[4773]: E1012 21:40:24.249171 4773 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.202:57390->38.102.83.202:45317: read tcp 38.102.83.202:57390->38.102.83.202:45317: read: connection reset by peer Oct 12 21:40:24 crc kubenswrapper[4773]: I1012 21:40:24.481816 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:40:24 crc kubenswrapper[4773]: E1012 21:40:24.482403 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:40:38 crc kubenswrapper[4773]: I1012 21:40:38.481479 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:40:38 crc kubenswrapper[4773]: E1012 21:40:38.482208 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:40:50 crc kubenswrapper[4773]: I1012 21:40:50.481301 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:40:50 crc kubenswrapper[4773]: E1012 21:40:50.481939 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:41:02 crc kubenswrapper[4773]: I1012 21:41:02.494800 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:41:02 crc kubenswrapper[4773]: E1012 21:41:02.495914 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:41:15 crc kubenswrapper[4773]: I1012 21:41:15.482059 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:41:15 crc kubenswrapper[4773]: E1012 21:41:15.482766 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:41:27 crc kubenswrapper[4773]: I1012 21:41:27.480572 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:41:27 crc kubenswrapper[4773]: E1012 21:41:27.481120 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:41:39 crc kubenswrapper[4773]: I1012 21:41:39.480946 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:41:39 crc kubenswrapper[4773]: E1012 21:41:39.482808 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:41:52 crc kubenswrapper[4773]: I1012 21:41:52.490212 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:41:52 crc kubenswrapper[4773]: E1012 21:41:52.490960 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:42:05 crc kubenswrapper[4773]: I1012 21:42:05.482325 4773 scope.go:117] "RemoveContainer" containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:42:06 crc kubenswrapper[4773]: I1012 21:42:06.571572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1"} Oct 12 21:42:14 crc kubenswrapper[4773]: I1012 21:42:14.645178 4773 generic.go:334] "Generic (PLEG): container finished" podID="8d978211-35bc-410c-a460-964734ca5359" containerID="a540a1a1060de67ea91fbb3435120fbea9a613094c5044b4deec0ef92db02c44" exitCode=0 Oct 12 21:42:14 crc kubenswrapper[4773]: I1012 21:42:14.645350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" event={"ID":"8d978211-35bc-410c-a460-964734ca5359","Type":"ContainerDied","Data":"a540a1a1060de67ea91fbb3435120fbea9a613094c5044b4deec0ef92db02c44"} Oct 12 21:42:14 crc kubenswrapper[4773]: I1012 21:42:14.647155 4773 scope.go:117] "RemoveContainer" containerID="a540a1a1060de67ea91fbb3435120fbea9a613094c5044b4deec0ef92db02c44" Oct 12 21:42:15 crc kubenswrapper[4773]: I1012 21:42:15.638593 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6fbk8_must-gather-cr6hf_8d978211-35bc-410c-a460-964734ca5359/gather/0.log" Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.160093 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6fbk8/must-gather-cr6hf"] Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.161174 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="copy" containerID="cri-o://adcfa4430e5f1fde13c9b12c6b2477723ddefba8a8c86198a579fead890e29b1" gracePeriod=2 Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.169351 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6fbk8/must-gather-cr6hf"] Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.741489 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-6fbk8_must-gather-cr6hf_8d978211-35bc-410c-a460-964734ca5359/copy/0.log" Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.742280 4773 generic.go:334] "Generic (PLEG): container finished" podID="8d978211-35bc-410c-a460-964734ca5359" containerID="adcfa4430e5f1fde13c9b12c6b2477723ddefba8a8c86198a579fead890e29b1" exitCode=143 Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.742323 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dc8f002096000a9cfb34e8c296854e3dfbf89735b15c1b9e40ee906fb4a09aa" Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.757284 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6fbk8_must-gather-cr6hf_8d978211-35bc-410c-a460-964734ca5359/copy/0.log" Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.757815 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.918186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output\") pod \"8d978211-35bc-410c-a460-964734ca5359\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.918865 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrs69\" (UniqueName: \"kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69\") pod \"8d978211-35bc-410c-a460-964734ca5359\" (UID: \"8d978211-35bc-410c-a460-964734ca5359\") " Oct 12 21:42:24 crc kubenswrapper[4773]: I1012 21:42:24.932575 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69" (OuterVolumeSpecName: 
"kube-api-access-lrs69") pod "8d978211-35bc-410c-a460-964734ca5359" (UID: "8d978211-35bc-410c-a460-964734ca5359"). InnerVolumeSpecName "kube-api-access-lrs69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:42:25 crc kubenswrapper[4773]: I1012 21:42:25.021164 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrs69\" (UniqueName: \"kubernetes.io/projected/8d978211-35bc-410c-a460-964734ca5359-kube-api-access-lrs69\") on node \"crc\" DevicePath \"\"" Oct 12 21:42:25 crc kubenswrapper[4773]: I1012 21:42:25.100399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8d978211-35bc-410c-a460-964734ca5359" (UID: "8d978211-35bc-410c-a460-964734ca5359"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:42:25 crc kubenswrapper[4773]: I1012 21:42:25.122499 4773 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d978211-35bc-410c-a460-964734ca5359-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 12 21:42:25 crc kubenswrapper[4773]: I1012 21:42:25.754691 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fbk8/must-gather-cr6hf" Oct 12 21:42:26 crc kubenswrapper[4773]: I1012 21:42:26.495800 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d978211-35bc-410c-a460-964734ca5359" path="/var/lib/kubelet/pods/8d978211-35bc-410c-a460-964734ca5359/volumes" Oct 12 21:42:47 crc kubenswrapper[4773]: I1012 21:42:47.382296 4773 scope.go:117] "RemoveContainer" containerID="a522132968203e9187f689162ccc4a4ae8a4643955edc873065e9d7226cc8e1e" Oct 12 21:42:47 crc kubenswrapper[4773]: I1012 21:42:47.419817 4773 scope.go:117] "RemoveContainer" containerID="a540a1a1060de67ea91fbb3435120fbea9a613094c5044b4deec0ef92db02c44" Oct 12 21:42:47 crc kubenswrapper[4773]: I1012 21:42:47.497039 4773 scope.go:117] "RemoveContainer" containerID="74bf1253aff1ee08b9540c5e5d4eff8bf0488730e94d06bef0f48a21a5e18ea8" Oct 12 21:42:47 crc kubenswrapper[4773]: I1012 21:42:47.526899 4773 scope.go:117] "RemoveContainer" containerID="2d21b2c0bd4cab708fbb4889905bc1676e3a5fa5051148cdacd0d305f3cf67b9" Oct 12 21:42:47 crc kubenswrapper[4773]: I1012 21:42:47.572811 4773 scope.go:117] "RemoveContainer" containerID="adcfa4430e5f1fde13c9b12c6b2477723ddefba8a8c86198a579fead890e29b1" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.353420 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cmz9/must-gather-jx5dd"] Oct 12 21:43:05 crc kubenswrapper[4773]: E1012 21:43:05.354225 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="gather" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.354236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="gather" Oct 12 21:43:05 crc kubenswrapper[4773]: E1012 21:43:05.354266 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="copy" Oct 12 21:43:05 crc 
kubenswrapper[4773]: I1012 21:43:05.354271 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="copy" Oct 12 21:43:05 crc kubenswrapper[4773]: E1012 21:43:05.354293 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea407521-51bc-4cab-90be-b39729d077c5" containerName="container-00" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.354299 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea407521-51bc-4cab-90be-b39729d077c5" containerName="container-00" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.354459 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="gather" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.354483 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d978211-35bc-410c-a460-964734ca5359" containerName="copy" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.354493 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea407521-51bc-4cab-90be-b39729d077c5" containerName="container-00" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.355380 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.359459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6cmz9"/"openshift-service-ca.crt" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.361324 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6cmz9"/"kube-root-ca.crt" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.366000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.366163 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7km9x\" (UniqueName: \"kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.371793 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6cmz9"/"default-dockercfg-zthfc" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.436517 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6cmz9/must-gather-jx5dd"] Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.468155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7km9x\" (UniqueName: \"kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " 
pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.468280 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.468891 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.489156 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7km9x\" (UniqueName: \"kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x\") pod \"must-gather-jx5dd\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") " pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:05 crc kubenswrapper[4773]: I1012 21:43:05.677164 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" Oct 12 21:43:06 crc kubenswrapper[4773]: I1012 21:43:06.166804 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6cmz9/must-gather-jx5dd"] Oct 12 21:43:07 crc kubenswrapper[4773]: I1012 21:43:07.144138 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" event={"ID":"74aeb93f-5898-4391-9fdc-555e496fcb91","Type":"ContainerStarted","Data":"de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af"} Oct 12 21:43:07 crc kubenswrapper[4773]: I1012 21:43:07.145699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" event={"ID":"74aeb93f-5898-4391-9fdc-555e496fcb91","Type":"ContainerStarted","Data":"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"} Oct 12 21:43:07 crc kubenswrapper[4773]: I1012 21:43:07.145812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" event={"ID":"74aeb93f-5898-4391-9fdc-555e496fcb91","Type":"ContainerStarted","Data":"2d327167026b63f6c02dadc38bbebeefe3d6a1535579733a101d6457d3b9266e"} Oct 12 21:43:07 crc kubenswrapper[4773]: I1012 21:43:07.164753 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" podStartSLOduration=2.164737123 podStartE2EDuration="2.164737123s" podCreationTimestamp="2025-10-12 21:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:43:07.164096385 +0000 UTC m=+4735.400394945" watchObservedRunningTime="2025-10-12 21:43:07.164737123 +0000 UTC m=+4735.401035683" Oct 12 21:43:10 crc kubenswrapper[4773]: I1012 21:43:10.917611 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-xqzgw"] Oct 12 21:43:10 crc kubenswrapper[4773]: 
I1012 21:43:10.919399 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.016396 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.018584 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.037877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.109967 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rd7\" (UniqueName: \"kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.110346 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.212610 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw75r\" (UniqueName: \"kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.212953 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.213108 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.213253 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rd7\" (UniqueName: \"kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.213393 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.213202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.247740 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rd7\" 
(UniqueName: \"kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7\") pod \"crc-debug-xqzgw\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.314905 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw75r\" (UniqueName: \"kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.315031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.315083 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.315527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.316148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.363466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw75r\" (UniqueName: \"kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r\") pod \"redhat-marketplace-66k7k\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.536425 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:11 crc kubenswrapper[4773]: W1012 21:43:11.567045 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4301a4b_c944_4710_b8c1_5b8bbb94be9d.slice/crio-95665e70af03fbfbde631e97ff39b5f31886ae8f047b532bcc66254dde6401dc WatchSource:0}: Error finding container 95665e70af03fbfbde631e97ff39b5f31886ae8f047b532bcc66254dde6401dc: Status 404 returned error can't find the container with id 95665e70af03fbfbde631e97ff39b5f31886ae8f047b532bcc66254dde6401dc Oct 12 21:43:11 crc kubenswrapper[4773]: I1012 21:43:11.632872 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:12 crc kubenswrapper[4773]: I1012 21:43:12.183121 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" event={"ID":"f4301a4b-c944-4710-b8c1-5b8bbb94be9d","Type":"ContainerStarted","Data":"9f84a81e4d39cb8d96c39573bb3f0d3ffdf9e412950b65f316f5fed13d348063"} Oct 12 21:43:12 crc kubenswrapper[4773]: I1012 21:43:12.186398 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" event={"ID":"f4301a4b-c944-4710-b8c1-5b8bbb94be9d","Type":"ContainerStarted","Data":"95665e70af03fbfbde631e97ff39b5f31886ae8f047b532bcc66254dde6401dc"} Oct 12 21:43:12 crc kubenswrapper[4773]: W1012 21:43:12.194872 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ad179c_e87c_4c17_9b77_b5793d7d3899.slice/crio-e32569d8f21d490806d5fa846916844bf20ffa8a790e5f115593c78aeb1eaf27 WatchSource:0}: Error finding container e32569d8f21d490806d5fa846916844bf20ffa8a790e5f115593c78aeb1eaf27: Status 404 returned error can't find the container with id e32569d8f21d490806d5fa846916844bf20ffa8a790e5f115593c78aeb1eaf27 Oct 12 21:43:12 crc kubenswrapper[4773]: I1012 21:43:12.196846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:12 crc kubenswrapper[4773]: I1012 21:43:12.217169 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" podStartSLOduration=2.217149132 podStartE2EDuration="2.217149132s" podCreationTimestamp="2025-10-12 21:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 21:43:12.207229028 +0000 UTC m=+4740.443527578" watchObservedRunningTime="2025-10-12 21:43:12.217149132 +0000 UTC 
m=+4740.453447702" Oct 12 21:43:13 crc kubenswrapper[4773]: I1012 21:43:13.192239 4773 generic.go:334] "Generic (PLEG): container finished" podID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerID="739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef" exitCode=0 Oct 12 21:43:13 crc kubenswrapper[4773]: I1012 21:43:13.192350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerDied","Data":"739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef"} Oct 12 21:43:13 crc kubenswrapper[4773]: I1012 21:43:13.192512 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerStarted","Data":"e32569d8f21d490806d5fa846916844bf20ffa8a790e5f115593c78aeb1eaf27"} Oct 12 21:43:13 crc kubenswrapper[4773]: I1012 21:43:13.194546 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 21:43:14 crc kubenswrapper[4773]: I1012 21:43:14.202090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerStarted","Data":"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128"} Oct 12 21:43:15 crc kubenswrapper[4773]: I1012 21:43:15.217074 4773 generic.go:334] "Generic (PLEG): container finished" podID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerID="f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128" exitCode=0 Oct 12 21:43:15 crc kubenswrapper[4773]: I1012 21:43:15.217296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerDied","Data":"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128"} Oct 12 21:43:16 
crc kubenswrapper[4773]: I1012 21:43:16.240702 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerStarted","Data":"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875"} Oct 12 21:43:16 crc kubenswrapper[4773]: I1012 21:43:16.267182 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66k7k" podStartSLOduration=3.802154034 podStartE2EDuration="6.267161651s" podCreationTimestamp="2025-10-12 21:43:10 +0000 UTC" firstStartedPulling="2025-10-12 21:43:13.194252203 +0000 UTC m=+4741.430550763" lastFinishedPulling="2025-10-12 21:43:15.65925982 +0000 UTC m=+4743.895558380" observedRunningTime="2025-10-12 21:43:16.256742623 +0000 UTC m=+4744.493041193" watchObservedRunningTime="2025-10-12 21:43:16.267161651 +0000 UTC m=+4744.503460231" Oct 12 21:43:21 crc kubenswrapper[4773]: I1012 21:43:21.633602 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:21 crc kubenswrapper[4773]: I1012 21:43:21.635103 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:21 crc kubenswrapper[4773]: I1012 21:43:21.683249 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:22 crc kubenswrapper[4773]: I1012 21:43:22.453910 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:22 crc kubenswrapper[4773]: I1012 21:43:22.525840 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.342993 4773 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-marketplace-66k7k" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="registry-server" containerID="cri-o://5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875" gracePeriod=2 Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.831884 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.983114 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content\") pod \"65ad179c-e87c-4c17-9b77-b5793d7d3899\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.983527 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw75r\" (UniqueName: \"kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r\") pod \"65ad179c-e87c-4c17-9b77-b5793d7d3899\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.983659 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities\") pod \"65ad179c-e87c-4c17-9b77-b5793d7d3899\" (UID: \"65ad179c-e87c-4c17-9b77-b5793d7d3899\") " Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.984317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities" (OuterVolumeSpecName: "utilities") pod "65ad179c-e87c-4c17-9b77-b5793d7d3899" (UID: "65ad179c-e87c-4c17-9b77-b5793d7d3899"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.998955 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r" (OuterVolumeSpecName: "kube-api-access-vw75r") pod "65ad179c-e87c-4c17-9b77-b5793d7d3899" (UID: "65ad179c-e87c-4c17-9b77-b5793d7d3899"). InnerVolumeSpecName "kube-api-access-vw75r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:43:24 crc kubenswrapper[4773]: I1012 21:43:24.999819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65ad179c-e87c-4c17-9b77-b5793d7d3899" (UID: "65ad179c-e87c-4c17-9b77-b5793d7d3899"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.085477 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.085504 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw75r\" (UniqueName: \"kubernetes.io/projected/65ad179c-e87c-4c17-9b77-b5793d7d3899-kube-api-access-vw75r\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.085515 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ad179c-e87c-4c17-9b77-b5793d7d3899-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.354461 4773 generic.go:334] "Generic (PLEG): container finished" podID="65ad179c-e87c-4c17-9b77-b5793d7d3899" 
containerID="5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875" exitCode=0 Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.354499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerDied","Data":"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875"} Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.354519 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66k7k" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.354531 4773 scope.go:117] "RemoveContainer" containerID="5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.354522 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66k7k" event={"ID":"65ad179c-e87c-4c17-9b77-b5793d7d3899","Type":"ContainerDied","Data":"e32569d8f21d490806d5fa846916844bf20ffa8a790e5f115593c78aeb1eaf27"} Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.375760 4773 scope.go:117] "RemoveContainer" containerID="f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.399513 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.416434 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66k7k"] Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.432190 4773 scope.go:117] "RemoveContainer" containerID="739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.463834 4773 scope.go:117] "RemoveContainer" containerID="5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875" Oct 12 
21:43:25 crc kubenswrapper[4773]: E1012 21:43:25.464951 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875\": container with ID starting with 5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875 not found: ID does not exist" containerID="5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.464990 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875"} err="failed to get container status \"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875\": rpc error: code = NotFound desc = could not find container \"5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875\": container with ID starting with 5496794f077ec70fa590276ae3987291de76fd53bb6bd9f90f5ecb2d01da3875 not found: ID does not exist" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.465015 4773 scope.go:117] "RemoveContainer" containerID="f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128" Oct 12 21:43:25 crc kubenswrapper[4773]: E1012 21:43:25.467122 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128\": container with ID starting with f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128 not found: ID does not exist" containerID="f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.467155 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128"} err="failed to get container status 
\"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128\": rpc error: code = NotFound desc = could not find container \"f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128\": container with ID starting with f9d0b0a2bef21ccd2fbd0d5cbd0d6b3b6352092631d2ac63affa1583462da128 not found: ID does not exist" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.467175 4773 scope.go:117] "RemoveContainer" containerID="739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef" Oct 12 21:43:25 crc kubenswrapper[4773]: E1012 21:43:25.467432 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef\": container with ID starting with 739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef not found: ID does not exist" containerID="739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef" Oct 12 21:43:25 crc kubenswrapper[4773]: I1012 21:43:25.467459 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef"} err="failed to get container status \"739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef\": rpc error: code = NotFound desc = could not find container \"739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef\": container with ID starting with 739da2c446856632c60acd4d4366eeef2fe6253caad5f718869fa08a981d13ef not found: ID does not exist" Oct 12 21:43:26 crc kubenswrapper[4773]: I1012 21:43:26.497086 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" path="/var/lib/kubelet/pods/65ad179c-e87c-4c17-9b77-b5793d7d3899/volumes" Oct 12 21:43:45 crc kubenswrapper[4773]: I1012 21:43:45.530466 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4301a4b-c944-4710-b8c1-5b8bbb94be9d" 
containerID="9f84a81e4d39cb8d96c39573bb3f0d3ffdf9e412950b65f316f5fed13d348063" exitCode=0 Oct 12 21:43:45 crc kubenswrapper[4773]: I1012 21:43:45.530634 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" event={"ID":"f4301a4b-c944-4710-b8c1-5b8bbb94be9d","Type":"ContainerDied","Data":"9f84a81e4d39cb8d96c39573bb3f0d3ffdf9e412950b65f316f5fed13d348063"} Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.834155 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.871752 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-xqzgw"] Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.915818 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-xqzgw"] Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.924583 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host\") pod \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.924857 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4rd7\" (UniqueName: \"kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7\") pod \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\" (UID: \"f4301a4b-c944-4710-b8c1-5b8bbb94be9d\") " Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.925351 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host" (OuterVolumeSpecName: "host") pod "f4301a4b-c944-4710-b8c1-5b8bbb94be9d" (UID: "f4301a4b-c944-4710-b8c1-5b8bbb94be9d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:43:46 crc kubenswrapper[4773]: I1012 21:43:46.932131 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7" (OuterVolumeSpecName: "kube-api-access-b4rd7") pod "f4301a4b-c944-4710-b8c1-5b8bbb94be9d" (UID: "f4301a4b-c944-4710-b8c1-5b8bbb94be9d"). InnerVolumeSpecName "kube-api-access-b4rd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:43:47 crc kubenswrapper[4773]: I1012 21:43:47.026785 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4rd7\" (UniqueName: \"kubernetes.io/projected/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-kube-api-access-b4rd7\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:47 crc kubenswrapper[4773]: I1012 21:43:47.027084 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4301a4b-c944-4710-b8c1-5b8bbb94be9d-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:47 crc kubenswrapper[4773]: I1012 21:43:47.551436 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95665e70af03fbfbde631e97ff39b5f31886ae8f047b532bcc66254dde6401dc" Oct 12 21:43:47 crc kubenswrapper[4773]: I1012 21:43:47.551524 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-xqzgw" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127081 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-b9c9h"] Oct 12 21:43:48 crc kubenswrapper[4773]: E1012 21:43:48.127447 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4301a4b-c944-4710-b8c1-5b8bbb94be9d" containerName="container-00" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127460 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4301a4b-c944-4710-b8c1-5b8bbb94be9d" containerName="container-00" Oct 12 21:43:48 crc kubenswrapper[4773]: E1012 21:43:48.127481 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="extract-content" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127487 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="extract-content" Oct 12 21:43:48 crc kubenswrapper[4773]: E1012 21:43:48.127503 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="extract-utilities" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127509 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="extract-utilities" Oct 12 21:43:48 crc kubenswrapper[4773]: E1012 21:43:48.127528 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="registry-server" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127533 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" containerName="registry-server" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127732 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ad179c-e87c-4c17-9b77-b5793d7d3899" 
containerName="registry-server" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.127741 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4301a4b-c944-4710-b8c1-5b8bbb94be9d" containerName="container-00" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.128305 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.246993 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvl9r\" (UniqueName: \"kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.247417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.349549 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvl9r\" (UniqueName: \"kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.349615 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc 
kubenswrapper[4773]: I1012 21:43:48.349817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.366163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvl9r\" (UniqueName: \"kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r\") pod \"crc-debug-b9c9h\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.442998 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.490935 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4301a4b-c944-4710-b8c1-5b8bbb94be9d" path="/var/lib/kubelet/pods/f4301a4b-c944-4710-b8c1-5b8bbb94be9d/volumes" Oct 12 21:43:48 crc kubenswrapper[4773]: I1012 21:43:48.560015 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" event={"ID":"d7c69057-f16c-4058-a5a5-6fdfc174e7ba","Type":"ContainerStarted","Data":"21b549aecc6c092c71f9edbf41335465795c38fb779b3fec009738178a9c6117"} Oct 12 21:43:49 crc kubenswrapper[4773]: I1012 21:43:49.569229 4773 generic.go:334] "Generic (PLEG): container finished" podID="d7c69057-f16c-4058-a5a5-6fdfc174e7ba" containerID="db62d0d2356666e2e4f1106c462d0a559edb3547fa00eca23e62ac5b3d5909f1" exitCode=0 Oct 12 21:43:49 crc kubenswrapper[4773]: I1012 21:43:49.569273 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" 
event={"ID":"d7c69057-f16c-4058-a5a5-6fdfc174e7ba","Type":"ContainerDied","Data":"db62d0d2356666e2e4f1106c462d0a559edb3547fa00eca23e62ac5b3d5909f1"} Oct 12 21:43:49 crc kubenswrapper[4773]: I1012 21:43:49.944936 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-b9c9h"] Oct 12 21:43:49 crc kubenswrapper[4773]: I1012 21:43:49.955185 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-b9c9h"] Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.673004 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.793887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvl9r\" (UniqueName: \"kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r\") pod \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.794053 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host\") pod \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\" (UID: \"d7c69057-f16c-4058-a5a5-6fdfc174e7ba\") " Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.794495 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host" (OuterVolumeSpecName: "host") pod "d7c69057-f16c-4058-a5a5-6fdfc174e7ba" (UID: "d7c69057-f16c-4058-a5a5-6fdfc174e7ba"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.811947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r" (OuterVolumeSpecName: "kube-api-access-vvl9r") pod "d7c69057-f16c-4058-a5a5-6fdfc174e7ba" (UID: "d7c69057-f16c-4058-a5a5-6fdfc174e7ba"). InnerVolumeSpecName "kube-api-access-vvl9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.895676 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvl9r\" (UniqueName: \"kubernetes.io/projected/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-kube-api-access-vvl9r\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:50 crc kubenswrapper[4773]: I1012 21:43:50.895778 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d7c69057-f16c-4058-a5a5-6fdfc174e7ba-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.191651 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-j44r9"] Oct 12 21:43:51 crc kubenswrapper[4773]: E1012 21:43:51.192030 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c69057-f16c-4058-a5a5-6fdfc174e7ba" containerName="container-00" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.192048 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c69057-f16c-4058-a5a5-6fdfc174e7ba" containerName="container-00" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.192216 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c69057-f16c-4058-a5a5-6fdfc174e7ba" containerName="container-00" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.192800 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.301903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfq2\" (UniqueName: \"kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.302302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.404576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfq2\" (UniqueName: \"kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.404740 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.404914 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc 
kubenswrapper[4773]: I1012 21:43:51.424680 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfq2\" (UniqueName: \"kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2\") pod \"crc-debug-j44r9\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.506376 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:51 crc kubenswrapper[4773]: W1012 21:43:51.548414 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode761a85d_9644_44c9_8473_324b7cbfc54e.slice/crio-8f3ea7122c11f9d9ab713bb6dabd8034651fd22fe27e02ca9855b8133d804f90 WatchSource:0}: Error finding container 8f3ea7122c11f9d9ab713bb6dabd8034651fd22fe27e02ca9855b8133d804f90: Status 404 returned error can't find the container with id 8f3ea7122c11f9d9ab713bb6dabd8034651fd22fe27e02ca9855b8133d804f90 Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.604573 4773 scope.go:117] "RemoveContainer" containerID="db62d0d2356666e2e4f1106c462d0a559edb3547fa00eca23e62ac5b3d5909f1" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.604690 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-b9c9h" Oct 12 21:43:51 crc kubenswrapper[4773]: I1012 21:43:51.619787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" event={"ID":"e761a85d-9644-44c9-8473-324b7cbfc54e","Type":"ContainerStarted","Data":"8f3ea7122c11f9d9ab713bb6dabd8034651fd22fe27e02ca9855b8133d804f90"} Oct 12 21:43:52 crc kubenswrapper[4773]: I1012 21:43:52.491020 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c69057-f16c-4058-a5a5-6fdfc174e7ba" path="/var/lib/kubelet/pods/d7c69057-f16c-4058-a5a5-6fdfc174e7ba/volumes" Oct 12 21:43:52 crc kubenswrapper[4773]: I1012 21:43:52.628663 4773 generic.go:334] "Generic (PLEG): container finished" podID="e761a85d-9644-44c9-8473-324b7cbfc54e" containerID="6fb66144016b8d25d89e90c442c6ea05f056bc56a5270234858d3a7534ae8ac0" exitCode=0 Oct 12 21:43:52 crc kubenswrapper[4773]: I1012 21:43:52.628736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" event={"ID":"e761a85d-9644-44c9-8473-324b7cbfc54e","Type":"ContainerDied","Data":"6fb66144016b8d25d89e90c442c6ea05f056bc56a5270234858d3a7534ae8ac0"} Oct 12 21:43:52 crc kubenswrapper[4773]: I1012 21:43:52.709707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-j44r9"] Oct 12 21:43:52 crc kubenswrapper[4773]: I1012 21:43:52.720935 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cmz9/crc-debug-j44r9"] Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.223133 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.255942 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host\") pod \"e761a85d-9644-44c9-8473-324b7cbfc54e\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.256045 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host" (OuterVolumeSpecName: "host") pod "e761a85d-9644-44c9-8473-324b7cbfc54e" (UID: "e761a85d-9644-44c9-8473-324b7cbfc54e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.256087 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfq2\" (UniqueName: \"kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2\") pod \"e761a85d-9644-44c9-8473-324b7cbfc54e\" (UID: \"e761a85d-9644-44c9-8473-324b7cbfc54e\") " Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.256396 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e761a85d-9644-44c9-8473-324b7cbfc54e-host\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.263943 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2" (OuterVolumeSpecName: "kube-api-access-vwfq2") pod "e761a85d-9644-44c9-8473-324b7cbfc54e" (UID: "e761a85d-9644-44c9-8473-324b7cbfc54e"). InnerVolumeSpecName "kube-api-access-vwfq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.358331 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfq2\" (UniqueName: \"kubernetes.io/projected/e761a85d-9644-44c9-8473-324b7cbfc54e-kube-api-access-vwfq2\") on node \"crc\" DevicePath \"\"" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.491999 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e761a85d-9644-44c9-8473-324b7cbfc54e" path="/var/lib/kubelet/pods/e761a85d-9644-44c9-8473-324b7cbfc54e/volumes" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.648688 4773 scope.go:117] "RemoveContainer" containerID="6fb66144016b8d25d89e90c442c6ea05f056bc56a5270234858d3a7534ae8ac0" Oct 12 21:43:54 crc kubenswrapper[4773]: I1012 21:43:54.648826 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/crc-debug-j44r9" Oct 12 21:44:28 crc kubenswrapper[4773]: I1012 21:44:28.669634 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:44:28 crc kubenswrapper[4773]: I1012 21:44:28.670358 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:44:44 crc kubenswrapper[4773]: I1012 21:44:44.554038 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b76446c56-rrfn7_44baa955-b25d-4648-aef5-423ad5992301/barbican-api/0.log" Oct 12 21:44:44 crc kubenswrapper[4773]: I1012 21:44:44.727457 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b76446c56-rrfn7_44baa955-b25d-4648-aef5-423ad5992301/barbican-api-log/0.log" Oct 12 21:44:44 crc kubenswrapper[4773]: I1012 21:44:44.890009 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b8cc45894-xq876_1fa331c5-06f0-4fac-b997-c68390b26f62/barbican-keystone-listener/0.log" Oct 12 21:44:45 crc kubenswrapper[4773]: I1012 21:44:45.007897 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b8cc45894-xq876_1fa331c5-06f0-4fac-b997-c68390b26f62/barbican-keystone-listener-log/0.log" Oct 12 21:44:45 crc kubenswrapper[4773]: I1012 21:44:45.348988 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bccc98b47-7pq24_bccbf811-29d3-4a21-856b-4ae1cfb29c74/barbican-worker/0.log" Oct 12 21:44:45 crc kubenswrapper[4773]: I1012 21:44:45.446168 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bccc98b47-7pq24_bccbf811-29d3-4a21-856b-4ae1cfb29c74/barbican-worker-log/0.log" Oct 12 21:44:45 crc kubenswrapper[4773]: I1012 21:44:45.641342 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qt9v2_25b3f977-6673-4aa8-aadc-89d98ceb7638/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:45 crc kubenswrapper[4773]: I1012 21:44:45.846904 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/ceilometer-central-agent/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.020300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/ceilometer-notification-agent/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.057571 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/proxy-httpd/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.202078 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9f9d8a89-ad27-4cb1-9bbf-62fd3a9e0a9d/sg-core/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.355758 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fnbwd_f7d6457c-5706-4a38-b0ef-24cc906b7cab/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.487908 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-rtr8q_9f4ee59f-a6f0-49ff-a38c-afa0df44a0c2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.681388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_47b1d281-5528-455b-8b30-b636772d29ce/cinder-api/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.697315 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_47b1d281-5528-455b-8b30-b636772d29ce/cinder-api-log/0.log" Oct 12 21:44:46 crc kubenswrapper[4773]: I1012 21:44:46.918473 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_716b9576-d48b-4720-9fb4-73f6744adee5/probe/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.067469 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_716b9576-d48b-4720-9fb4-73f6744adee5/cinder-backup/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.182634 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_32308982-4e4e-4ca5-98d8-b173e22fa341/cinder-scheduler/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.286963 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_32308982-4e4e-4ca5-98d8-b173e22fa341/probe/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.399085 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c3963233-e9ff-4f92-a94a-5b99835ab607/probe/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.448369 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c3963233-e9ff-4f92-a94a-5b99835ab607/cinder-volume/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.580491 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xzgst_dbdc8cc6-2430-493d-8fa1-bdc3bc9fa617/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.652706 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-f77w9_17284681-e0c1-42f8-8ee2-2b3f8e73e6d1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:47 crc kubenswrapper[4773]: I1012 21:44:47.823987 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/init/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.090268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/init/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.145454 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6225801c-d77f-493c-834a-1393a8a1d239/glance-httpd/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.228379 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7887c4559f-fs5qk_6875c763-6837-4c47-8738-b66b6d4e6306/dnsmasq-dns/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.250333 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6225801c-d77f-493c-834a-1393a8a1d239/glance-log/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.345309 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e607bc3-d77b-4dfb-a697-911f6dea3244/glance-httpd/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.400887 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e607bc3-d77b-4dfb-a697-911f6dea3244/glance-log/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.628225 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5486cbb4-g66c2_7f878004-f437-4db3-a695-09d92a0bc6e4/horizon/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.757777 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5486cbb4-g66c2_7f878004-f437-4db3-a695-09d92a0bc6e4/horizon-log/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.799271 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ngbkk_23eb0d3e-06b9-4b1e-b493-27d00d4f34f4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:48 crc kubenswrapper[4773]: I1012 21:44:48.963931 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fk9wz_4d530e38-f79d-4b93-9d2a-ad94eddb69b1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.206840 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-6d9b9d6b96-hvhdj_addfad9c-82e3-4f44-883e-c88e44a3641d/keystone-api/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.224870 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29338381-pxljd_d2812224-3ef8-431f-896d-01d9d78c3650/keystone-cron/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.374434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5c04ed55-a588-4a57-9f14-90fca8e2dab0/kube-state-metrics/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.484151 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6rf7v_e6ba6b5a-7d13-4280-8584-a51dd0a6ab0c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.610174 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_06283d24-b053-4893-9dab-4bfe5daf18b1/manila-api-log/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.798647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_06283d24-b053-4893-9dab-4bfe5daf18b1/manila-api/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.852863 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dfbdaab1-e327-4291-a585-829aa6b81f00/probe/0.log" Oct 12 21:44:49 crc kubenswrapper[4773]: I1012 21:44:49.877951 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dfbdaab1-e327-4291-a585-829aa6b81f00/manila-scheduler/0.log" Oct 12 21:44:50 crc kubenswrapper[4773]: I1012 21:44:50.060247 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5db237cf-3d2a-48e1-bf07-a92ae2d96139/manila-share/0.log" Oct 12 21:44:50 crc kubenswrapper[4773]: I1012 21:44:50.104671 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-share-share1-0_5db237cf-3d2a-48e1-bf07-a92ae2d96139/probe/0.log" Oct 12 21:44:50 crc kubenswrapper[4773]: I1012 21:44:50.447542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7857d9f9fc-69hqj_86be19ef-4d97-4e17-bfbc-3c9c8153cd76/neutron-api/0.log" Oct 12 21:44:50 crc kubenswrapper[4773]: I1012 21:44:50.486600 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7857d9f9fc-69hqj_86be19ef-4d97-4e17-bfbc-3c9c8153cd76/neutron-httpd/0.log" Oct 12 21:44:50 crc kubenswrapper[4773]: I1012 21:44:50.731941 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5n5jb_0f130afc-51e5-494f-b915-7ec573c760b1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:51 crc kubenswrapper[4773]: I1012 21:44:51.435167 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8163c65f-b48b-4fd5-b7c1-12d94abfa723/nova-api-log/0.log" Oct 12 21:44:51 crc kubenswrapper[4773]: I1012 21:44:51.498967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6698b6b5-2a0c-45d1-a3dc-ea58147105dc/nova-cell0-conductor-conductor/0.log" Oct 12 21:44:51 crc kubenswrapper[4773]: I1012 21:44:51.859382 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5838c3b9-38bc-4a97-bcbc-3a734b6b230f/nova-cell1-conductor-conductor/0.log" Oct 12 21:44:51 crc kubenswrapper[4773]: I1012 21:44:51.960728 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4f9f8baf-3f94-4e6c-b5ec-f9763330a042/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 21:44:52 crc kubenswrapper[4773]: I1012 21:44:52.021223 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8163c65f-b48b-4fd5-b7c1-12d94abfa723/nova-api-api/0.log" Oct 12 21:44:52 crc kubenswrapper[4773]: I1012 
21:44:52.217080 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-whrkq_f216a3f9-57a7-4084-b8c1-ed07cd69d4ac/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:52 crc kubenswrapper[4773]: I1012 21:44:52.357568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74ec9771-6918-4102-abd1-7b9130f91a4d/nova-metadata-log/0.log" Oct 12 21:44:53 crc kubenswrapper[4773]: I1012 21:44:53.174167 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/mysql-bootstrap/0.log" Oct 12 21:44:53 crc kubenswrapper[4773]: I1012 21:44:53.269299 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bd4ee762-fd56-496f-860b-89201215948c/nova-scheduler-scheduler/0.log" Oct 12 21:44:53 crc kubenswrapper[4773]: I1012 21:44:53.466286 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/galera/0.log" Oct 12 21:44:53 crc kubenswrapper[4773]: I1012 21:44:53.466622 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7566f11c-8e52-4fb6-b1a2-98b388ffefd9/mysql-bootstrap/0.log" Oct 12 21:44:53 crc kubenswrapper[4773]: I1012 21:44:53.684019 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/mysql-bootstrap/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.254563 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/mysql-bootstrap/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.280544 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0cb52152-8a83-4122-be94-0d2803fd5cc7/galera/0.log" Oct 12 21:44:54 crc 
kubenswrapper[4773]: I1012 21:44:54.312777 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74ec9771-6918-4102-abd1-7b9130f91a4d/nova-metadata-metadata/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.500359 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3028de9d-aaa8-4c46-9cbb-a4ab147bf458/openstackclient/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.592061 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7kldg_b57600c2-89e9-4db4-a846-48235987e13c/openstack-network-exporter/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.832486 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server-init/0.log" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.879177 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:44:54 crc kubenswrapper[4773]: E1012 21:44:54.879538 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761a85d-9644-44c9-8473-324b7cbfc54e" containerName="container-00" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.879556 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761a85d-9644-44c9-8473-324b7cbfc54e" containerName="container-00" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.879754 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e761a85d-9644-44c9-8473-324b7cbfc54e" containerName="container-00" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.881085 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:54 crc kubenswrapper[4773]: I1012 21:44:54.912351 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.013927 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.014215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.014387 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zc9g\" (UniqueName: \"kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.116242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zc9g\" (UniqueName: \"kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.116409 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.116438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.117077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.117108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.153134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zc9g\" (UniqueName: \"kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g\") pod \"certified-operators-2v7wk\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.156821 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server/0.log" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.157766 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovsdb-server-init/0.log" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.201059 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.202054 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wfpwq_55ef70a8-016d-403f-ab02-820088160f9c/ovs-vswitchd/0.log" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.840269 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.870070 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sf74r_1a08bcbe-fa8c-43b2-a4fb-ae2212de940d/ovn-controller/0.log" Oct 12 21:44:55 crc kubenswrapper[4773]: I1012 21:44:55.964899 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dlkwd_62f86eec-8f45-4449-a363-cb195f58abbd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.118180 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4a50ca31-4c77-488f-aed7-aa99e82677f0/openstack-network-exporter/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.194644 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4a50ca31-4c77-488f-aed7-aa99e82677f0/ovn-northd/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.196352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerDied","Data":"2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931"} Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.196857 4773 generic.go:334] "Generic (PLEG): container finished" podID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerID="2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931" exitCode=0 Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.200821 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerStarted","Data":"e003fb8b2f9fbd068cf3d85828d28bef27e450a8e0ffd8816f98ccb99925d54d"} Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.296368 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc85890c-cde2-470e-87de-4d69f1682bd0/openstack-network-exporter/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.373886 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dc85890c-cde2-470e-87de-4d69f1682bd0/ovsdbserver-nb/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.564265 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_21606d72-32b0-4552-ac26-df0425f03cdf/ovsdbserver-sb/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.600160 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_21606d72-32b0-4552-ac26-df0425f03cdf/openstack-network-exporter/0.log" Oct 12 21:44:56 crc kubenswrapper[4773]: I1012 21:44:56.885745 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d7c87b9bb-vwlxb_c9a14159-b8fe-40c9-b7ac-6c410c02a0ab/placement-api/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.095295 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/setup-container/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.114548 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d7c87b9bb-vwlxb_c9a14159-b8fe-40c9-b7ac-6c410c02a0ab/placement-log/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.225093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerStarted","Data":"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0"} Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.407203 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/setup-container/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.448070 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/setup-container/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.458086 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c6bb2e3-2f0e-499a-b349-07ea3eb7190d/rabbitmq/0.log" Oct 12 21:44:57 crc kubenswrapper[4773]: I1012 21:44:57.987415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/rabbitmq/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.008565 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ch565_073e807a-4708-4b50-abf6-f66668e13e8e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.122000 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_0b0fae69-d926-472c-a222-3a98f25a1e14/setup-container/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.612686 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-b5c4p_d3045d7b-b25d-4036-bea1-0b5f184476eb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.616126 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wptgl_83cb532a-174c-41c0-a271-95a66d439f0c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.673127 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.673175 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.931902 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sdsnz_8c71487d-25fd-480c-90ca-4ca43f86a247/ssh-known-hosts-edpm-deployment/0.log" Oct 12 21:44:58 crc kubenswrapper[4773]: I1012 21:44:58.939462 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4d6f8e69-6323-4d99-bdd3-7bb8ca4e6345/tempest-tests-tempest-tests-runner/0.log" Oct 12 21:44:59 crc kubenswrapper[4773]: I1012 21:44:59.251132 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerID="3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0" exitCode=0 Oct 12 21:44:59 crc kubenswrapper[4773]: I1012 21:44:59.251173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerDied","Data":"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0"} Oct 12 21:44:59 crc kubenswrapper[4773]: I1012 21:44:59.284487 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8b33e95-51a4-42fe-a706-fc146ef7ce27/test-operator-logs-container/0.log" Oct 12 21:44:59 crc kubenswrapper[4773]: I1012 21:44:59.297709 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h5w5j_9ef8a23e-6501-4e90-a51c-0d57cee847af/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.156099 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv"] Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.157817 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.160459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.161065 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.187775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv"] Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.239867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ksxh\" (UniqueName: \"kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.239974 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.240011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.276947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerStarted","Data":"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139"} Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.306858 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2v7wk" podStartSLOduration=2.748254547 podStartE2EDuration="6.306843405s" podCreationTimestamp="2025-10-12 21:44:54 +0000 UTC" firstStartedPulling="2025-10-12 21:44:56.198806052 +0000 UTC m=+4844.435104612" lastFinishedPulling="2025-10-12 21:44:59.75739491 +0000 UTC m=+4847.993693470" observedRunningTime="2025-10-12 21:45:00.299216114 +0000 UTC m=+4848.535514674" watchObservedRunningTime="2025-10-12 21:45:00.306843405 +0000 UTC m=+4848.543141965" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.341467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ksxh\" (UniqueName: \"kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.341571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.341606 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.357530 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.360208 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ksxh\" (UniqueName: \"kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.380766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume\") pod \"collect-profiles-29338425-c2zbv\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.479634 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:00 crc kubenswrapper[4773]: I1012 21:45:00.989588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv"] Oct 12 21:45:01 crc kubenswrapper[4773]: W1012 21:45:01.114023 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc285e1e_8a4a_4a1e_96ff_37de788c9087.slice/crio-78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94 WatchSource:0}: Error finding container 78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94: Status 404 returned error can't find the container with id 78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94 Oct 12 21:45:01 crc kubenswrapper[4773]: I1012 21:45:01.302097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" event={"ID":"dc285e1e-8a4a-4a1e-96ff-37de788c9087","Type":"ContainerStarted","Data":"78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94"} Oct 12 21:45:02 crc kubenswrapper[4773]: I1012 21:45:02.310554 4773 generic.go:334] "Generic (PLEG): container finished" podID="dc285e1e-8a4a-4a1e-96ff-37de788c9087" containerID="bf90f2bc14d75686e0e7a83f60c7b8fd752079d7858335087df915120f24b261" exitCode=0 Oct 12 21:45:02 crc kubenswrapper[4773]: I1012 21:45:02.310863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" event={"ID":"dc285e1e-8a4a-4a1e-96ff-37de788c9087","Type":"ContainerDied","Data":"bf90f2bc14d75686e0e7a83f60c7b8fd752079d7858335087df915120f24b261"} Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.690497 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.731625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ksxh\" (UniqueName: \"kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh\") pod \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.732223 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume\") pod \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.732293 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume\") pod \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\" (UID: \"dc285e1e-8a4a-4a1e-96ff-37de788c9087\") " Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.733405 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc285e1e-8a4a-4a1e-96ff-37de788c9087" (UID: "dc285e1e-8a4a-4a1e-96ff-37de788c9087"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.756964 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc285e1e-8a4a-4a1e-96ff-37de788c9087" (UID: "dc285e1e-8a4a-4a1e-96ff-37de788c9087"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.758487 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh" (OuterVolumeSpecName: "kube-api-access-9ksxh") pod "dc285e1e-8a4a-4a1e-96ff-37de788c9087" (UID: "dc285e1e-8a4a-4a1e-96ff-37de788c9087"). InnerVolumeSpecName "kube-api-access-9ksxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.834208 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc285e1e-8a4a-4a1e-96ff-37de788c9087-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.834246 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc285e1e-8a4a-4a1e-96ff-37de788c9087-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:03 crc kubenswrapper[4773]: I1012 21:45:03.834258 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ksxh\" (UniqueName: \"kubernetes.io/projected/dc285e1e-8a4a-4a1e-96ff-37de788c9087-kube-api-access-9ksxh\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:04 crc kubenswrapper[4773]: I1012 21:45:04.325544 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" event={"ID":"dc285e1e-8a4a-4a1e-96ff-37de788c9087","Type":"ContainerDied","Data":"78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94"} Oct 12 21:45:04 crc kubenswrapper[4773]: I1012 21:45:04.325872 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bc86f5589a5d88fef4877bebbf0dcc9f8795f3a5fc33e36464b18fca046e94" Oct 12 21:45:04 crc kubenswrapper[4773]: I1012 21:45:04.325647 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29338425-c2zbv" Oct 12 21:45:04 crc kubenswrapper[4773]: I1012 21:45:04.761844 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9"] Oct 12 21:45:04 crc kubenswrapper[4773]: I1012 21:45:04.771200 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29338380-zrtg9"] Oct 12 21:45:05 crc kubenswrapper[4773]: I1012 21:45:05.201939 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:05 crc kubenswrapper[4773]: I1012 21:45:05.202128 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:05 crc kubenswrapper[4773]: I1012 21:45:05.273807 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:05 crc kubenswrapper[4773]: I1012 21:45:05.390733 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:05 crc kubenswrapper[4773]: I1012 21:45:05.508609 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:45:06 crc kubenswrapper[4773]: I1012 21:45:06.491991 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20fdc591-be6d-43c2-8ea1-d69358551827" path="/var/lib/kubelet/pods/20fdc591-be6d-43c2-8ea1-d69358551827/volumes" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.346745 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2v7wk" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="registry-server" 
containerID="cri-o://2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139" gracePeriod=2 Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.786272 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.805706 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zc9g\" (UniqueName: \"kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g\") pod \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.806123 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") pod \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.806301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities\") pod \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.808671 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities" (OuterVolumeSpecName: "utilities") pod "2b9d88d7-06ef-4a7e-89d7-67009fdacd29" (UID: "2b9d88d7-06ef-4a7e-89d7-67009fdacd29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.823184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g" (OuterVolumeSpecName: "kube-api-access-6zc9g") pod "2b9d88d7-06ef-4a7e-89d7-67009fdacd29" (UID: "2b9d88d7-06ef-4a7e-89d7-67009fdacd29"). InnerVolumeSpecName "kube-api-access-6zc9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.908622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9d88d7-06ef-4a7e-89d7-67009fdacd29" (UID: "2b9d88d7-06ef-4a7e-89d7-67009fdacd29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.911100 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") pod \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\" (UID: \"2b9d88d7-06ef-4a7e-89d7-67009fdacd29\") " Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.911789 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.911815 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zc9g\" (UniqueName: \"kubernetes.io/projected/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-kube-api-access-6zc9g\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:07 crc kubenswrapper[4773]: W1012 21:45:07.916170 4773 empty_dir.go:500] Warning: Unmount skipped because path does not exist: 
/var/lib/kubelet/pods/2b9d88d7-06ef-4a7e-89d7-67009fdacd29/volumes/kubernetes.io~empty-dir/catalog-content Oct 12 21:45:07 crc kubenswrapper[4773]: I1012 21:45:07.917982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9d88d7-06ef-4a7e-89d7-67009fdacd29" (UID: "2b9d88d7-06ef-4a7e-89d7-67009fdacd29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.013589 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9d88d7-06ef-4a7e-89d7-67009fdacd29-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.368057 4773 generic.go:334] "Generic (PLEG): container finished" podID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerID="2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139" exitCode=0 Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.368363 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerDied","Data":"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139"} Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.368392 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v7wk" event={"ID":"2b9d88d7-06ef-4a7e-89d7-67009fdacd29","Type":"ContainerDied","Data":"e003fb8b2f9fbd068cf3d85828d28bef27e450a8e0ffd8816f98ccb99925d54d"} Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.368408 4773 scope.go:117] "RemoveContainer" containerID="2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.368571 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v7wk" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.412667 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.424110 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2v7wk"] Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.427868 4773 scope.go:117] "RemoveContainer" containerID="3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.463692 4773 scope.go:117] "RemoveContainer" containerID="2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.493147 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" path="/var/lib/kubelet/pods/2b9d88d7-06ef-4a7e-89d7-67009fdacd29/volumes" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.529858 4773 scope.go:117] "RemoveContainer" containerID="2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139" Oct 12 21:45:08 crc kubenswrapper[4773]: E1012 21:45:08.530539 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139\": container with ID starting with 2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139 not found: ID does not exist" containerID="2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.530570 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139"} err="failed to get container status 
\"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139\": rpc error: code = NotFound desc = could not find container \"2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139\": container with ID starting with 2e3e1c781eca5ed98e308e82bcd58d9be2b6357ea9813ce559252852ebab3139 not found: ID does not exist" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.530588 4773 scope.go:117] "RemoveContainer" containerID="3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0" Oct 12 21:45:08 crc kubenswrapper[4773]: E1012 21:45:08.533897 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0\": container with ID starting with 3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0 not found: ID does not exist" containerID="3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.533937 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0"} err="failed to get container status \"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0\": rpc error: code = NotFound desc = could not find container \"3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0\": container with ID starting with 3e2e0e05ba2b80c8d76caf5eb4c655a7697c96f53b321f99483132b35fe1ccd0 not found: ID does not exist" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.533980 4773 scope.go:117] "RemoveContainer" containerID="2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931" Oct 12 21:45:08 crc kubenswrapper[4773]: E1012 21:45:08.535261 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931\": container with ID starting with 2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931 not found: ID does not exist" containerID="2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931" Oct 12 21:45:08 crc kubenswrapper[4773]: I1012 21:45:08.535303 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931"} err="failed to get container status \"2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931\": rpc error: code = NotFound desc = could not find container \"2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931\": container with ID starting with 2f1ea0d1963daf54f97337f6b4a487d072fb653291bcf02a5c6c65a245bc3931 not found: ID does not exist" Oct 12 21:45:15 crc kubenswrapper[4773]: I1012 21:45:15.442318 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8aeb19a9-db56-488b-9410-004f24e8d11a/memcached/0.log" Oct 12 21:45:28 crc kubenswrapper[4773]: I1012 21:45:28.670041 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:45:28 crc kubenswrapper[4773]: I1012 21:45:28.670808 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:45:28 crc kubenswrapper[4773]: I1012 21:45:28.670880 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:45:28 crc kubenswrapper[4773]: I1012 21:45:28.671971 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:45:28 crc kubenswrapper[4773]: I1012 21:45:28.672075 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" containerID="cri-o://79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1" gracePeriod=600 Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.571956 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1" exitCode=0 Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.572031 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1"} Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.572608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerStarted","Data":"53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"} Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.572645 4773 scope.go:117] "RemoveContainer" 
containerID="28088f0a12d087c134365b1148b70ea2ac0084b926ca9db657d24e2c2fa9091b" Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.660900 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-rqvz2_ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a/manager/0.log" Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.678667 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-rqvz2_ea9f5c3c-ab2a-49b6-aa9c-5ea540b1df9a/kube-rbac-proxy/0.log" Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.830427 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:45:29 crc kubenswrapper[4773]: I1012 21:45:29.981147 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.013466 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.043109 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.171325 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/util/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.215891 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/extract/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.221282 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365br5vtr_38982daf-184a-4b70-b9d5-f37c23f908f2/pull/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.405104 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-4wwwj_e3a81848-dc85-44b3-addf-35cb34c1e85a/kube-rbac-proxy/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.498112 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-sfgw7_69dd4207-8b02-4a43-bc3a-9c939881422f/kube-rbac-proxy/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.559220 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-4wwwj_e3a81848-dc85-44b3-addf-35cb34c1e85a/manager/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.625355 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-sfgw7_69dd4207-8b02-4a43-bc3a-9c939881422f/manager/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.760676 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-llrqr_3bff7ce4-adb2-494b-8644-f8e7568efa62/kube-rbac-proxy/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.850589 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-llrqr_3bff7ce4-adb2-494b-8644-f8e7568efa62/manager/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 
21:45:30.900157 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xnqzt_e963f42c-7955-4378-927e-1ab264a6116e/kube-rbac-proxy/0.log" Oct 12 21:45:30 crc kubenswrapper[4773]: I1012 21:45:30.990003 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xnqzt_e963f42c-7955-4378-927e-1ab264a6116e/manager/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.117025 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-shp22_9392e042-5a5f-47d2-9232-3fa47cce88f3/manager/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.135962 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-shp22_9392e042-5a5f-47d2-9232-3fa47cce88f3/kube-rbac-proxy/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.338473 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-8j4jx_6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17/kube-rbac-proxy/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.478951 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-8j4jx_6fd8fe9c-3d34-4d22-8bc0-1536aa8b2e17/manager/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.493916 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-fqqwc_83700a3c-4ccd-4ac6-8c0a-c530623ffdfe/kube-rbac-proxy/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.590108 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-fqqwc_83700a3c-4ccd-4ac6-8c0a-c530623ffdfe/manager/0.log" Oct 12 
21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.699437 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-sc6z2_5321f2fd-a14c-4a48-be68-bdbefe80aa8d/kube-rbac-proxy/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.813709 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-sc6z2_5321f2fd-a14c-4a48-be68-bdbefe80aa8d/manager/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.888275 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-jh557_2e680c12-2026-4296-8ffa-d0185c12d2c1/kube-rbac-proxy/0.log" Oct 12 21:45:31 crc kubenswrapper[4773]: I1012 21:45:31.968410 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-jh557_2e680c12-2026-4296-8ffa-d0185c12d2c1/manager/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.107176 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-dtppq_843b5d05-f35d-4632-8781-4c60ed803cb6/kube-rbac-proxy/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.132865 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-dtppq_843b5d05-f35d-4632-8781-4c60ed803cb6/manager/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.266610 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-b6kht_7dc3b970-233d-4af3-a341-8297af5433bc/kube-rbac-proxy/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.357340 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-b6kht_7dc3b970-233d-4af3-a341-8297af5433bc/manager/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.421667 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-pbmbc_f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b/kube-rbac-proxy/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.544185 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-pbmbc_f43b4c25-e5cf-4865-8ad2-9395e9e6cf1b/manager/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.698424 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-nvvg8_095b027c-fb46-4d19-bbcf-84871f8c90f7/kube-rbac-proxy/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.718263 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-nvvg8_095b027c-fb46-4d19-bbcf-84871f8c90f7/manager/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.857058 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf_b9e5880b-293d-4311-8928-f93649649c93/kube-rbac-proxy/0.log" Oct 12 21:45:32 crc kubenswrapper[4773]: I1012 21:45:32.945042 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bnrhzf_b9e5880b-293d-4311-8928-f93649649c93/manager/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.088152 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-xgrzm_094825fc-aaad-4717-9d34-426f1f3fa63f/kube-rbac-proxy/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: 
I1012 21:45:33.201303 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-qgbcw_96a44ad1-ead6-4c4d-be23-622d643a0bf0/kube-rbac-proxy/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.448326 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-qgbcw_96a44ad1-ead6-4c4d-be23-622d643a0bf0/operator/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.491523 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5p2sl_237d38ea-2958-4510-a3e3-20b37bf0814d/registry-server/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.748267 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-rczp5_34c81f6e-1829-4f0f-a0aa-951b4d4f41c4/kube-rbac-proxy/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.875357 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-rczp5_34c81f6e-1829-4f0f-a0aa-951b4d4f41c4/manager/0.log" Oct 12 21:45:33 crc kubenswrapper[4773]: I1012 21:45:33.928420 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-thj2w_2b083bd3-8fe4-44c8-8d3e-f736260b8210/kube-rbac-proxy/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.053659 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-thj2w_2b083bd3-8fe4-44c8-8d3e-f736260b8210/manager/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.253276 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8j8bm_f7349c73-c56a-4e87-8618-dea521d99b95/operator/0.log" Oct 12 
21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.352445 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vx9cr_b2ec8f8f-d841-4683-86ed-54ec360d9ec1/kube-rbac-proxy/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.403802 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-xgrzm_094825fc-aaad-4717-9d34-426f1f3fa63f/manager/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.434383 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vx9cr_b2ec8f8f-d841-4683-86ed-54ec360d9ec1/manager/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.464736 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-bcdc4_293153be-33db-41ba-a589-55a17026c756/kube-rbac-proxy/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.608042 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-bcdc4_293153be-33db-41ba-a589-55a17026c756/manager/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.640089 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-hz6mw_e38708a6-e3b7-407d-8fe5-f27cd9a69f76/kube-rbac-proxy/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.660950 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-hz6mw_e38708a6-e3b7-407d-8fe5-f27cd9a69f76/manager/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.820405 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-4dsfj_38cef8bd-b25e-47aa-8f3f-9af1289f72f8/kube-rbac-proxy/0.log" Oct 12 21:45:34 crc kubenswrapper[4773]: I1012 21:45:34.871099 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-4dsfj_38cef8bd-b25e-47aa-8f3f-9af1289f72f8/manager/0.log" Oct 12 21:45:47 crc kubenswrapper[4773]: I1012 21:45:47.734776 4773 scope.go:117] "RemoveContainer" containerID="bc0d1d6616d9f4070dec2f76f0318fdfd692c664772e2fc717ac9008d55fa187" Oct 12 21:45:51 crc kubenswrapper[4773]: I1012 21:45:51.975974 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-89dsq_fa851b59-ffb3-46c4-a61e-31f85d43eb7a/control-plane-machine-set-operator/0.log" Oct 12 21:45:52 crc kubenswrapper[4773]: I1012 21:45:52.156912 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmsrw_bf381381-f5d3-4217-8a9c-cf527e2c6c65/kube-rbac-proxy/0.log" Oct 12 21:45:52 crc kubenswrapper[4773]: I1012 21:45:52.192947 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmsrw_bf381381-f5d3-4217-8a9c-cf527e2c6c65/machine-api-operator/0.log" Oct 12 21:46:05 crc kubenswrapper[4773]: I1012 21:46:05.621403 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qbzg6_6003117d-518b-4b81-98ba-01ffbdea09c7/cert-manager-controller/0.log" Oct 12 21:46:05 crc kubenswrapper[4773]: I1012 21:46:05.781323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-lshqf_5c1610be-cf14-4659-8bf8-46cbcb55aa47/cert-manager-cainjector/0.log" Oct 12 21:46:05 crc kubenswrapper[4773]: I1012 21:46:05.855591 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gh5rv_882eaacb-03d9-4250-ab13-b702c4f4b91c/cert-manager-webhook/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.128926 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-77gl5_476898a0-6b77-4b46-8a73-1a0fa1e336c8/nmstate-console-plugin/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.300923 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gpbvq_04007580-35e5-42d5-84ec-1e44c4d6d914/nmstate-handler/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.426180 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8mnws_751cb256-7079-497a-a027-a9c295bc9832/kube-rbac-proxy/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.448953 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-8mnws_751cb256-7079-497a-a027-a9c295bc9832/nmstate-metrics/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.694080 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5qz2w_c290e672-35df-4626-8034-095052214269/nmstate-operator/0.log" Oct 12 21:46:17 crc kubenswrapper[4773]: I1012 21:46:17.748035 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-wzw64_fdf1901d-c523-4385-9415-fae96f1ea74c/nmstate-webhook/0.log" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.784491 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:20 crc kubenswrapper[4773]: E1012 21:46:20.786434 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="extract-content" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 
21:46:20.786514 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="extract-content" Oct 12 21:46:20 crc kubenswrapper[4773]: E1012 21:46:20.786589 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="registry-server" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.786645 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="registry-server" Oct 12 21:46:20 crc kubenswrapper[4773]: E1012 21:46:20.786741 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc285e1e-8a4a-4a1e-96ff-37de788c9087" containerName="collect-profiles" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.786827 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc285e1e-8a4a-4a1e-96ff-37de788c9087" containerName="collect-profiles" Oct 12 21:46:20 crc kubenswrapper[4773]: E1012 21:46:20.786897 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="extract-utilities" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.786964 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="extract-utilities" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.787221 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9d88d7-06ef-4a7e-89d7-67009fdacd29" containerName="registry-server" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.787302 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc285e1e-8a4a-4a1e-96ff-37de788c9087" containerName="collect-profiles" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.789085 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.799111 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.928326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.928711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5j98\" (UniqueName: \"kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:20 crc kubenswrapper[4773]: I1012 21:46:20.928799 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.031185 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.031344 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.031389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5j98\" (UniqueName: \"kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.031834 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.032159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.058913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5j98\" (UniqueName: \"kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98\") pod \"redhat-operators-p685w\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.121763 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:21 crc kubenswrapper[4773]: I1012 21:46:21.656855 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:23 crc kubenswrapper[4773]: I1012 21:46:23.021677 4773 generic.go:334] "Generic (PLEG): container finished" podID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerID="e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb" exitCode=0 Oct 12 21:46:23 crc kubenswrapper[4773]: I1012 21:46:23.021783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerDied","Data":"e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb"} Oct 12 21:46:23 crc kubenswrapper[4773]: I1012 21:46:23.022219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerStarted","Data":"7e77671914d4df05655b77b35d7591f48a309e46329bd60cf68461404650d2c5"} Oct 12 21:46:25 crc kubenswrapper[4773]: I1012 21:46:25.039465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerStarted","Data":"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61"} Oct 12 21:46:28 crc kubenswrapper[4773]: I1012 21:46:28.078567 4773 generic.go:334] "Generic (PLEG): container finished" podID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerID="8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61" exitCode=0 Oct 12 21:46:28 crc kubenswrapper[4773]: I1012 21:46:28.078666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" 
event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerDied","Data":"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61"} Oct 12 21:46:29 crc kubenswrapper[4773]: I1012 21:46:29.090860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerStarted","Data":"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb"} Oct 12 21:46:29 crc kubenswrapper[4773]: I1012 21:46:29.115047 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p685w" podStartSLOduration=3.564645508 podStartE2EDuration="9.115032268s" podCreationTimestamp="2025-10-12 21:46:20 +0000 UTC" firstStartedPulling="2025-10-12 21:46:23.024314126 +0000 UTC m=+4931.260612696" lastFinishedPulling="2025-10-12 21:46:28.574700856 +0000 UTC m=+4936.810999456" observedRunningTime="2025-10-12 21:46:29.111559032 +0000 UTC m=+4937.347857592" watchObservedRunningTime="2025-10-12 21:46:29.115032268 +0000 UTC m=+4937.351330828" Oct 12 21:46:31 crc kubenswrapper[4773]: I1012 21:46:31.123587 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:31 crc kubenswrapper[4773]: I1012 21:46:31.123948 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:32 crc kubenswrapper[4773]: I1012 21:46:32.460946 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p685w" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="registry-server" probeResult="failure" output=< Oct 12 21:46:32 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Oct 12 21:46:32 crc kubenswrapper[4773]: > Oct 12 21:46:33 crc kubenswrapper[4773]: I1012 21:46:33.556790 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-mcbq6_9c120d38-3572-486b-9b37-946d2358e130/kube-rbac-proxy/0.log" Oct 12 21:46:33 crc kubenswrapper[4773]: I1012 21:46:33.696904 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mcbq6_9c120d38-3572-486b-9b37-946d2358e130/controller/0.log" Oct 12 21:46:33 crc kubenswrapper[4773]: I1012 21:46:33.862018 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.048712 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.112415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.142304 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.157829 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.398300 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.406961 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.441200 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.445963 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.654694 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-frr-files/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.656955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-reloader/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.685623 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/cp-metrics/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.690456 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/controller/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.883230 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/kube-rbac-proxy/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.883769 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/kube-rbac-proxy-frr/0.log" Oct 12 21:46:34 crc kubenswrapper[4773]: I1012 21:46:34.890068 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/frr-metrics/0.log" Oct 12 21:46:35 crc kubenswrapper[4773]: I1012 21:46:35.154626 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/reloader/0.log" Oct 12 21:46:35 crc kubenswrapper[4773]: I1012 21:46:35.179388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-sq7f5_aba7a037-467a-40bd-b2e5-4c446be76185/frr-k8s-webhook-server/0.log" Oct 12 21:46:35 crc kubenswrapper[4773]: I1012 21:46:35.465319 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d86f779f8-r94wm_774b15ab-55ba-42a6-8a77-13690e6aa683/manager/0.log" Oct 12 21:46:35 crc kubenswrapper[4773]: I1012 21:46:35.770067 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d8b4c7c74-pbqqx_180f9b25-f871-4854-b535-73fd6bd1d7f0/webhook-server/0.log" Oct 12 21:46:35 crc kubenswrapper[4773]: I1012 21:46:35.837502 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-df4jg_5c70a42a-d5f5-4b1d-b23b-cd672597789c/kube-rbac-proxy/0.log" Oct 12 21:46:36 crc kubenswrapper[4773]: I1012 21:46:36.488546 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-clw9f_d895af47-6572-42a4-805b-56be09e5e40c/frr/0.log" Oct 12 21:46:36 crc kubenswrapper[4773]: I1012 21:46:36.604689 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-df4jg_5c70a42a-d5f5-4b1d-b23b-cd672597789c/speaker/0.log" Oct 12 21:46:41 crc kubenswrapper[4773]: I1012 21:46:41.173240 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:41 crc kubenswrapper[4773]: I1012 21:46:41.241846 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:41 crc kubenswrapper[4773]: I1012 21:46:41.409149 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.212241 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p685w" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="registry-server" containerID="cri-o://210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb" gracePeriod=2 Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.703283 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.876026 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content\") pod \"b76ecc2a-e477-462f-8bab-9630db547ec2\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.877152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5j98\" (UniqueName: \"kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98\") pod \"b76ecc2a-e477-462f-8bab-9630db547ec2\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.877349 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities\") pod \"b76ecc2a-e477-462f-8bab-9630db547ec2\" (UID: \"b76ecc2a-e477-462f-8bab-9630db547ec2\") " Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.878326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities" (OuterVolumeSpecName: "utilities") pod "b76ecc2a-e477-462f-8bab-9630db547ec2" (UID: 
"b76ecc2a-e477-462f-8bab-9630db547ec2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.879133 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.892217 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98" (OuterVolumeSpecName: "kube-api-access-m5j98") pod "b76ecc2a-e477-462f-8bab-9630db547ec2" (UID: "b76ecc2a-e477-462f-8bab-9630db547ec2"). InnerVolumeSpecName "kube-api-access-m5j98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.961112 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b76ecc2a-e477-462f-8bab-9630db547ec2" (UID: "b76ecc2a-e477-462f-8bab-9630db547ec2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.981313 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5j98\" (UniqueName: \"kubernetes.io/projected/b76ecc2a-e477-462f-8bab-9630db547ec2-kube-api-access-m5j98\") on node \"crc\" DevicePath \"\"" Oct 12 21:46:42 crc kubenswrapper[4773]: I1012 21:46:42.981354 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76ecc2a-e477-462f-8bab-9630db547ec2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.230709 4773 generic.go:334] "Generic (PLEG): container finished" podID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerID="210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb" exitCode=0 Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.230764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerDied","Data":"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb"} Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.230788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p685w" event={"ID":"b76ecc2a-e477-462f-8bab-9630db547ec2","Type":"ContainerDied","Data":"7e77671914d4df05655b77b35d7591f48a309e46329bd60cf68461404650d2c5"} Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.230805 4773 scope.go:117] "RemoveContainer" containerID="210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.230919 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p685w" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.266016 4773 scope.go:117] "RemoveContainer" containerID="8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.266149 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.272440 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p685w"] Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.287313 4773 scope.go:117] "RemoveContainer" containerID="e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.336647 4773 scope.go:117] "RemoveContainer" containerID="210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb" Oct 12 21:46:43 crc kubenswrapper[4773]: E1012 21:46:43.337083 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb\": container with ID starting with 210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb not found: ID does not exist" containerID="210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.337122 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb"} err="failed to get container status \"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb\": rpc error: code = NotFound desc = could not find container \"210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb\": container with ID starting with 210ba589721fda6ade76e1ee14fd540ec7b53770c96c5fb0d675516a49085beb not found: ID does 
not exist" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.337150 4773 scope.go:117] "RemoveContainer" containerID="8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61" Oct 12 21:46:43 crc kubenswrapper[4773]: E1012 21:46:43.337426 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61\": container with ID starting with 8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61 not found: ID does not exist" containerID="8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.337460 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61"} err="failed to get container status \"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61\": rpc error: code = NotFound desc = could not find container \"8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61\": container with ID starting with 8b321f0e8df9b93dc4470958eee448bfbfe0ad92a4a33e2754d2668788a82a61 not found: ID does not exist" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.337481 4773 scope.go:117] "RemoveContainer" containerID="e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb" Oct 12 21:46:43 crc kubenswrapper[4773]: E1012 21:46:43.337848 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb\": container with ID starting with e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb not found: ID does not exist" containerID="e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb" Oct 12 21:46:43 crc kubenswrapper[4773]: I1012 21:46:43.337878 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb"} err="failed to get container status \"e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb\": rpc error: code = NotFound desc = could not find container \"e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb\": container with ID starting with e1b270fb322eb64d2ace4bcd81f6204f4c2ddc23f43c6135040559c3e4a150eb not found: ID does not exist" Oct 12 21:46:44 crc kubenswrapper[4773]: I1012 21:46:44.495157 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" path="/var/lib/kubelet/pods/b76ecc2a-e477-462f-8bab-9630db547ec2/volumes" Oct 12 21:46:49 crc kubenswrapper[4773]: I1012 21:46:49.571192 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:46:49 crc kubenswrapper[4773]: I1012 21:46:49.783553 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:46:49 crc kubenswrapper[4773]: I1012 21:46:49.810913 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:46:49 crc kubenswrapper[4773]: I1012 21:46:49.826589 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.046789 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/pull/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.068757 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/util/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.069796 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22bj2j_e282a672-4919-479e-9bdc-796dc2986e33/extract/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.217206 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.435767 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.445100 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.483000 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.639677 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-content/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.644322 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/extract-utilities/0.log" Oct 12 21:46:50 crc kubenswrapper[4773]: I1012 21:46:50.885562 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.105470 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bnf6k_645a380b-fa47-45e2-a370-164d09e3a646/registry-server/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.139178 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.215837 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.218267 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.392693 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-utilities/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.393263 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/extract-content/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.684735 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:46:51 crc kubenswrapper[4773]: I1012 21:46:51.963972 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.005029 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr5bp_b7565fd8-a54f-4a89-8162-633feec6e76f/registry-server/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.046593 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.097095 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.208391 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/util/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.220448 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/pull/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.298641 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxgvfh_2aa293c3-6ed8-473b-b2dd-ec4a0515d08f/extract/0.log" Oct 12 21:46:52 crc 
kubenswrapper[4773]: I1012 21:46:52.378168 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2v2pc_3f993aa7-e2c9-41bb-96ba-0b4e0682c92a/marketplace-operator/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.520976 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.731115 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.746090 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.788446 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:46:52 crc kubenswrapper[4773]: I1012 21:46:52.996322 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-content/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.011834 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/extract-utilities/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.172411 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.241948 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgmbc_5bef1d5c-305a-457b-8d9d-d22b1d65d077/registry-server/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.429321 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.470272 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.472091 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.622959 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-content/0.log" Oct 12 21:46:53 crc kubenswrapper[4773]: I1012 21:46:53.643085 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/extract-utilities/0.log" Oct 12 21:46:54 crc kubenswrapper[4773]: I1012 21:46:54.215850 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j7pqm_88fd9e23-9895-4ff5-a626-c695ec043315/registry-server/0.log" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.333848 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:25 crc kubenswrapper[4773]: E1012 21:47:25.334738 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="extract-utilities" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.334750 4773 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="extract-utilities" Oct 12 21:47:25 crc kubenswrapper[4773]: E1012 21:47:25.334763 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="registry-server" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.334771 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="registry-server" Oct 12 21:47:25 crc kubenswrapper[4773]: E1012 21:47:25.334783 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="extract-content" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.334789 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="extract-content" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.334984 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76ecc2a-e477-462f-8bab-9630db547ec2" containerName="registry-server" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.336323 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.357933 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.423820 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.424138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbx7\" (UniqueName: \"kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.424210 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.525840 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.526006 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5hbx7\" (UniqueName: \"kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.526032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.526345 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.526473 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.552661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbx7\" (UniqueName: \"kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7\") pod \"community-operators-vbd98\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:25 crc kubenswrapper[4773]: I1012 21:47:25.652780 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:26 crc kubenswrapper[4773]: I1012 21:47:26.255950 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:26 crc kubenswrapper[4773]: I1012 21:47:26.590563 4773 generic.go:334] "Generic (PLEG): container finished" podID="d38fa3af-e175-426e-a4a5-431d3bc3a50a" containerID="1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300" exitCode=0 Oct 12 21:47:26 crc kubenswrapper[4773]: I1012 21:47:26.590608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerDied","Data":"1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300"} Oct 12 21:47:26 crc kubenswrapper[4773]: I1012 21:47:26.590889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerStarted","Data":"02f15d83b447d7e6aef67b994341cd68344a87750efefdebe96c375535ab7e6a"} Oct 12 21:47:27 crc kubenswrapper[4773]: I1012 21:47:27.602339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerStarted","Data":"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1"} Oct 12 21:47:28 crc kubenswrapper[4773]: I1012 21:47:28.611184 4773 generic.go:334] "Generic (PLEG): container finished" podID="d38fa3af-e175-426e-a4a5-431d3bc3a50a" containerID="ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1" exitCode=0 Oct 12 21:47:28 crc kubenswrapper[4773]: I1012 21:47:28.611473 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" 
event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerDied","Data":"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1"} Oct 12 21:47:29 crc kubenswrapper[4773]: I1012 21:47:29.653453 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerStarted","Data":"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b"} Oct 12 21:47:29 crc kubenswrapper[4773]: I1012 21:47:29.696597 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbd98" podStartSLOduration=2.23962185 podStartE2EDuration="4.696579344s" podCreationTimestamp="2025-10-12 21:47:25 +0000 UTC" firstStartedPulling="2025-10-12 21:47:26.593676827 +0000 UTC m=+4994.829975397" lastFinishedPulling="2025-10-12 21:47:29.050634321 +0000 UTC m=+4997.286932891" observedRunningTime="2025-10-12 21:47:29.680110149 +0000 UTC m=+4997.916408709" watchObservedRunningTime="2025-10-12 21:47:29.696579344 +0000 UTC m=+4997.932877904" Oct 12 21:47:35 crc kubenswrapper[4773]: I1012 21:47:35.653845 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:35 crc kubenswrapper[4773]: I1012 21:47:35.654638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:35 crc kubenswrapper[4773]: I1012 21:47:35.730369 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:35 crc kubenswrapper[4773]: I1012 21:47:35.815464 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:35 crc kubenswrapper[4773]: I1012 21:47:35.985314 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:37 crc kubenswrapper[4773]: I1012 21:47:37.746431 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbd98" podUID="d38fa3af-e175-426e-a4a5-431d3bc3a50a" containerName="registry-server" containerID="cri-o://0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b" gracePeriod=2 Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.217521 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.268503 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content\") pod \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.268617 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hbx7\" (UniqueName: \"kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7\") pod \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.268704 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities\") pod \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\" (UID: \"d38fa3af-e175-426e-a4a5-431d3bc3a50a\") " Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.275834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities" (OuterVolumeSpecName: "utilities") pod "d38fa3af-e175-426e-a4a5-431d3bc3a50a" (UID: 
"d38fa3af-e175-426e-a4a5-431d3bc3a50a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.300252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7" (OuterVolumeSpecName: "kube-api-access-5hbx7") pod "d38fa3af-e175-426e-a4a5-431d3bc3a50a" (UID: "d38fa3af-e175-426e-a4a5-431d3bc3a50a"). InnerVolumeSpecName "kube-api-access-5hbx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.372119 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.372162 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hbx7\" (UniqueName: \"kubernetes.io/projected/d38fa3af-e175-426e-a4a5-431d3bc3a50a-kube-api-access-5hbx7\") on node \"crc\" DevicePath \"\"" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.482601 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d38fa3af-e175-426e-a4a5-431d3bc3a50a" (UID: "d38fa3af-e175-426e-a4a5-431d3bc3a50a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.575315 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38fa3af-e175-426e-a4a5-431d3bc3a50a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.760539 4773 generic.go:334] "Generic (PLEG): container finished" podID="d38fa3af-e175-426e-a4a5-431d3bc3a50a" containerID="0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b" exitCode=0 Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.760579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerDied","Data":"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b"} Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.760604 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbd98" event={"ID":"d38fa3af-e175-426e-a4a5-431d3bc3a50a","Type":"ContainerDied","Data":"02f15d83b447d7e6aef67b994341cd68344a87750efefdebe96c375535ab7e6a"} Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.760622 4773 scope.go:117] "RemoveContainer" containerID="0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.760679 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbd98" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.793700 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.808142 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbd98"] Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.821447 4773 scope.go:117] "RemoveContainer" containerID="ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.847462 4773 scope.go:117] "RemoveContainer" containerID="1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.885892 4773 scope.go:117] "RemoveContainer" containerID="0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b" Oct 12 21:47:38 crc kubenswrapper[4773]: E1012 21:47:38.886294 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b\": container with ID starting with 0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b not found: ID does not exist" containerID="0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.886343 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b"} err="failed to get container status \"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b\": rpc error: code = NotFound desc = could not find container \"0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b\": container with ID starting with 0f98178a069c6caa7e9d89be3f60b9ab0152325524870503feafda00d067a70b not 
found: ID does not exist" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.886371 4773 scope.go:117] "RemoveContainer" containerID="ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1" Oct 12 21:47:38 crc kubenswrapper[4773]: E1012 21:47:38.886709 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1\": container with ID starting with ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1 not found: ID does not exist" containerID="ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.886760 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1"} err="failed to get container status \"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1\": rpc error: code = NotFound desc = could not find container \"ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1\": container with ID starting with ec9dafef143f79e5b6d5f979bd7df7105039617fafd0b8956e577cdb8ada88c1 not found: ID does not exist" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.886787 4773 scope.go:117] "RemoveContainer" containerID="1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300" Oct 12 21:47:38 crc kubenswrapper[4773]: E1012 21:47:38.887054 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300\": container with ID starting with 1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300 not found: ID does not exist" containerID="1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300" Oct 12 21:47:38 crc kubenswrapper[4773]: I1012 21:47:38.887079 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300"} err="failed to get container status \"1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300\": rpc error: code = NotFound desc = could not find container \"1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300\": container with ID starting with 1f39d80cded54772b63f479986b5989c3161f27525d1261b63989247c65c7300 not found: ID does not exist" Oct 12 21:47:40 crc kubenswrapper[4773]: I1012 21:47:40.505150 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38fa3af-e175-426e-a4a5-431d3bc3a50a" path="/var/lib/kubelet/pods/d38fa3af-e175-426e-a4a5-431d3bc3a50a/volumes" Oct 12 21:47:58 crc kubenswrapper[4773]: I1012 21:47:58.669986 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:47:58 crc kubenswrapper[4773]: I1012 21:47:58.670590 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:48:28 crc kubenswrapper[4773]: I1012 21:48:28.669930 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:48:28 crc kubenswrapper[4773]: I1012 21:48:28.670418 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:48:58 crc kubenswrapper[4773]: I1012 21:48:58.670012 4773 patch_prober.go:28] interesting pod/machine-config-daemon-cbx9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 21:48:58 crc kubenswrapper[4773]: I1012 21:48:58.670449 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 21:48:58 crc kubenswrapper[4773]: I1012 21:48:58.670493 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" Oct 12 21:48:58 crc kubenswrapper[4773]: I1012 21:48:58.671227 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"} pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 21:48:58 crc kubenswrapper[4773]: I1012 21:48:58.671280 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerName="machine-config-daemon" 
containerID="cri-o://53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46" gracePeriod=600 Oct 12 21:48:58 crc kubenswrapper[4773]: E1012 21:48:58.808493 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:48:59 crc kubenswrapper[4773]: I1012 21:48:59.572959 4773 generic.go:334] "Generic (PLEG): container finished" podID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46" exitCode=0 Oct 12 21:48:59 crc kubenswrapper[4773]: I1012 21:48:59.573000 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" event={"ID":"c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f","Type":"ContainerDied","Data":"53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"} Oct 12 21:48:59 crc kubenswrapper[4773]: I1012 21:48:59.573030 4773 scope.go:117] "RemoveContainer" containerID="79231d88867177fbbffb4ed0e24f773b0847d2dff7ab975fa4b34e14b03b54f1" Oct 12 21:48:59 crc kubenswrapper[4773]: I1012 21:48:59.574174 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46" Oct 12 21:48:59 crc kubenswrapper[4773]: E1012 21:48:59.574706 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" 
podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:49:09 crc kubenswrapper[4773]: I1012 21:49:09.686984 4773 generic.go:334] "Generic (PLEG): container finished" podID="74aeb93f-5898-4391-9fdc-555e496fcb91" containerID="b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0" exitCode=0 Oct 12 21:49:09 crc kubenswrapper[4773]: I1012 21:49:09.687100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" event={"ID":"74aeb93f-5898-4391-9fdc-555e496fcb91","Type":"ContainerDied","Data":"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"} Oct 12 21:49:09 crc kubenswrapper[4773]: I1012 21:49:09.688899 4773 scope.go:117] "RemoveContainer" containerID="b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0" Oct 12 21:49:10 crc kubenswrapper[4773]: I1012 21:49:10.124429 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cmz9_must-gather-jx5dd_74aeb93f-5898-4391-9fdc-555e496fcb91/gather/0.log" Oct 12 21:49:14 crc kubenswrapper[4773]: I1012 21:49:14.482695 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46" Oct 12 21:49:14 crc kubenswrapper[4773]: E1012 21:49:14.483593 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f" Oct 12 21:49:22 crc kubenswrapper[4773]: I1012 21:49:22.680417 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cmz9/must-gather-jx5dd"] Oct 12 21:49:22 crc kubenswrapper[4773]: I1012 21:49:22.681062 4773 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-6cmz9/must-gather-jx5dd" podUID="74aeb93f-5898-4391-9fdc-555e496fcb91" containerName="copy" containerID="cri-o://de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af" gracePeriod=2
Oct 12 21:49:22 crc kubenswrapper[4773]: I1012 21:49:22.691337 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cmz9/must-gather-jx5dd"]
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.104434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cmz9_must-gather-jx5dd_74aeb93f-5898-4391-9fdc-555e496fcb91/copy/0.log"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.105055 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/must-gather-jx5dd"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.271166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7km9x\" (UniqueName: \"kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x\") pod \"74aeb93f-5898-4391-9fdc-555e496fcb91\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") "
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.271340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output\") pod \"74aeb93f-5898-4391-9fdc-555e496fcb91\" (UID: \"74aeb93f-5898-4391-9fdc-555e496fcb91\") "
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.287935 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x" (OuterVolumeSpecName: "kube-api-access-7km9x") pod "74aeb93f-5898-4391-9fdc-555e496fcb91" (UID: "74aeb93f-5898-4391-9fdc-555e496fcb91"). InnerVolumeSpecName "kube-api-access-7km9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.382365 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7km9x\" (UniqueName: \"kubernetes.io/projected/74aeb93f-5898-4391-9fdc-555e496fcb91-kube-api-access-7km9x\") on node \"crc\" DevicePath \"\""
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.486145 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "74aeb93f-5898-4391-9fdc-555e496fcb91" (UID: "74aeb93f-5898-4391-9fdc-555e496fcb91"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.585774 4773 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/74aeb93f-5898-4391-9fdc-555e496fcb91-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.817915 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cmz9_must-gather-jx5dd_74aeb93f-5898-4391-9fdc-555e496fcb91/copy/0.log"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.818462 4773 generic.go:334] "Generic (PLEG): container finished" podID="74aeb93f-5898-4391-9fdc-555e496fcb91" containerID="de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af" exitCode=143
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.818514 4773 scope.go:117] "RemoveContainer" containerID="de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.818684 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cmz9/must-gather-jx5dd"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.867798 4773 scope.go:117] "RemoveContainer" containerID="b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.907090 4773 scope.go:117] "RemoveContainer" containerID="de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af"
Oct 12 21:49:23 crc kubenswrapper[4773]: E1012 21:49:23.909215 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af\": container with ID starting with de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af not found: ID does not exist" containerID="de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.909249 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af"} err="failed to get container status \"de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af\": rpc error: code = NotFound desc = could not find container \"de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af\": container with ID starting with de67d289bfb15edcf5f733efc4550c4f9cf6adc41cb9f93604eb7c6c83bcf2af not found: ID does not exist"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.909270 4773 scope.go:117] "RemoveContainer" containerID="b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"
Oct 12 21:49:23 crc kubenswrapper[4773]: E1012 21:49:23.909571 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0\": container with ID starting with b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0 not found: ID does not exist" containerID="b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"
Oct 12 21:49:23 crc kubenswrapper[4773]: I1012 21:49:23.909631 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0"} err="failed to get container status \"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0\": rpc error: code = NotFound desc = could not find container \"b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0\": container with ID starting with b98da6e63f1dfc8eac528a06a0c846b5923b0300218c4935c14174cbd90ac3e0 not found: ID does not exist"
Oct 12 21:49:24 crc kubenswrapper[4773]: I1012 21:49:24.491828 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74aeb93f-5898-4391-9fdc-555e496fcb91" path="/var/lib/kubelet/pods/74aeb93f-5898-4391-9fdc-555e496fcb91/volumes"
Oct 12 21:49:27 crc kubenswrapper[4773]: I1012 21:49:27.481145 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:49:27 crc kubenswrapper[4773]: E1012 21:49:27.481879 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:49:40 crc kubenswrapper[4773]: I1012 21:49:40.482704 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:49:40 crc kubenswrapper[4773]: E1012 21:49:40.483821 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:49:47 crc kubenswrapper[4773]: I1012 21:49:47.921751 4773 scope.go:117] "RemoveContainer" containerID="9f84a81e4d39cb8d96c39573bb3f0d3ffdf9e412950b65f316f5fed13d348063"
Oct 12 21:49:54 crc kubenswrapper[4773]: I1012 21:49:54.481079 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:49:54 crc kubenswrapper[4773]: E1012 21:49:54.482178 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:50:09 crc kubenswrapper[4773]: I1012 21:50:09.481840 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:50:09 crc kubenswrapper[4773]: E1012 21:50:09.483145 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:50:22 crc kubenswrapper[4773]: I1012 21:50:22.487631 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:50:22 crc kubenswrapper[4773]: E1012 21:50:22.488528 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:50:33 crc kubenswrapper[4773]: I1012 21:50:33.481658 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:50:33 crc kubenswrapper[4773]: E1012 21:50:33.482650 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:50:44 crc kubenswrapper[4773]: I1012 21:50:44.480991 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:50:44 crc kubenswrapper[4773]: E1012 21:50:44.481627 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"
Oct 12 21:50:59 crc kubenswrapper[4773]: I1012 21:50:59.481076 4773 scope.go:117] "RemoveContainer" containerID="53864a5d3481e59a488a56a26f5646881ac435014e275833162044632c238b46"
Oct 12 21:50:59 crc kubenswrapper[4773]: E1012 21:50:59.481779 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cbx9j_openshift-machine-config-operator(c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-cbx9j" podUID="c4659ccb-e7e6-4c79-9f0b-5e8c3c2aad4f"